In this paper, we discuss the challenges of evaluating authoring systems developed to help people design visualizations for communication purposes. We reflect on our own experiences in evaluating the visualization authoring systems that we have developed, as well as on the evaluation methods used in other recent projects. We also examine alternative approaches for evaluating visualization authoring systems that we believe to be more appropriate than traditional comparative studies. We hope that our discussion is informative, not only for researchers who intend to develop novel visualization authoring systems, but also for reviewers assigned to assess the research contributions of these systems. We conclude with opportunities for facilitating the evaluation and adoption of deployed visualization authoring systems.