Data Talks
Podcast

Program Evaluation and Research Consulting
The Power of Retrospective Pretests to Address Common Survey Research Challenges

Episode in Data Talks
James Pann interviews Melanie Hwalek, Ph.D., a program evaluation consultant, to discuss the retrospective pretest (RPT) design, focusing on its practical applications and the findings from her recent research detailed in the paper, “Designing a Questionnaire with Retrospective PrePost Items: Format Matters.” RPT is particularly useful for evaluating changes in participants’ perceptions or self-assessments following...
Children and education 1 year
38:09

Empowering Change: David Fetterman on Using Evaluation to Build a Better World

Episode in Data Talks
David Fetterman is a leading expert in empowerment evaluation, an approach that emphasizes collaboration, participation, and capacity building. He has written extensively on the topic, and his work has been used in a wide range of settings, including government agencies, non-profit organizations, and businesses. David’s work focuses on helping people evaluate their programs and initiatives...
Children and education 2 years
51:46

From Lecture Halls to Real-World Calls: Tiffany Berry’s Evaluation Insights

Episode in Data Talks
Whether you’re an educator, a student, or simply someone passionate about youth development and educational programs, this podcast episode with Tiffany Berry, Ph.D., promises to give you insights into the complex world of evaluation. She is the Dean and a full research professor in the Division of Behavioral & Organizational Sciences at Claremont Graduate University,...
Children and education 2 years
59:35

Mindfulness Meets Evaluation: Insights from Jim McDavid

Episode in Data Talks
In this episode, I talk with Jim McDavid, Ph.D., about his experience with mindfulness and meditation practice, how it has influenced him, and how it affects how he views and practices evaluation. Our conversation also covers practical wisdom, Jim’s interest in the environment, and challenges associated with determining cause and effect in evaluation. Jim is...
Children and education 2 years
01:02:46

Maximize Your Survey Response Rates: Expert Insights from Sheila Robinson

Episode in Data Talks
In this podcast episode, James Pann, Ph.D., interviews Sheila Robinson, Ed.D., about the topic of surveys and response rates. We focus on the significance of response rates in surveys and the steps that can be taken to maximize them. Sheila is a career educator and professional learning designer with experience in K-12 public education and...
Children and education 3 years
34:12

Young Adults With Cancer Learn About Mindfulness & Connection During Nature Treks with David Victorson

Episode in Data Talks
James Pann, Ph.D., interviews David Victorson, Ph.D., of True North Treks, a nonprofit organization whose mission is to empower young adults and caregivers affected by cancer to "find direction through connection" and mindfulness. As a child, David grew up surrounded by nature and its many restorative benefits. So when he went on to complete his postdoctoral fellowship in psychosocial oncology as a psychologist, he saw an opportunity to bring nature's gifts to the young adult cancer patients he was seeing. In 2008, he co-founded True North Treks to fill some of the unmet needs of these cancer survivors and their caretakers and help them get their lives back on track. The reconnecting power of nature, coupled with mindfulness and meditation, laid the basis for these restorative journeys.

David goes on to discuss one of the most reported unmet needs: isolation. Many young cancer patients and survivors feel like they don't know anybody like them. These treks allow the opportunity for deep social connection with others going through the same or similar experiences. These needs and solutions developed into three key points.

True North Treks' 3 Crucial Connections:
1) Connection with nature (after going through something as unnatural as cancer treatment)
2) Connection with peers who get it and have walked a similar path
3) Connection with oneself through mindful awareness practices, such as meditation and yoga

While it may sound like a therapy session at first, David emphasizes the lack of an explicit group therapy aspect. The guides are trained never to question the participants about their cancer and instead simply sit back and allow them to speak their minds. Often, the participants will immediately start talking about their cancer experience on their own. The guides, primarily mental health professionals, are taught to be themselves and simply bring mindfulness coaching.

The participants benefit from the mindfulness and yoga experience and from being with each other in the outdoors. That said, a "therapeutic" aspect tends to emerge on its own when the participants find themselves with several others just like themselves. Mindfulness being one of the 3 Crucial Connections, David defines what it means on the treks: simply the act of tuning into our present-moment experience with qualities of openness, curiosity, and self-kindness. Beyond the definition, David and the coaches try to get participants to practice mindfulness regularly throughout the treks. This could start simply by observing and noticing things, talking about experiences in a group setting, and sitting with uncomfortable emotions. David indicated that outcomes differ among participants after trying mindfulness practices. Some may be completely open-minded, while others may be very skeptical. Some may bring mindfulness to the little things in life, like a cup of coffee, while others carry the new awareness to a more significant aspect of their life, like their cancer journey.

Analysis and Outcomes of the Treks
When he's not on a trek, David is at Northwestern University, where he does outcomes research and other academic activities. This has helped him develop a more focused and practical study and outcome analysis of the treks. In a recently published study of True North Treks, they used a version of the Patient-Reported Outcomes Measurement Information System (PROMIS), which he helped to develop at Northwestern University. They were even able to use some of the participants' blood samples to analyze levels of circulating inflammatory cytokines and other proxies for bodily stress before and after the trek. They found that many who showed high levels of depression, anxiety, and sleep issues at the beginning dropped to low or normal levels by the end of the treks.

David finishes by describing that they have not done a longitudinal impact study to see how many participants continue the mindfulness and yoga practices, but with these treks participants get an introduction to the practices and the tools to continue on their own. It's no longer a foreign concept or practice, and with the connections developed with other participants, many appear to continue to benefit from the trek experience. If you or someone you know could benefit from one of these treks, you can enroll here. In addition, you can donate or fill out a volunteer interest form here. To reach David directly, email him at davidvictorson@truenorthtreks.org.

Timeline
00:00 – Introduction / How True North Treks came about
04:02 – The 3 Crucial Connections
05:14 – What happens on the trips
09:40 – Mindfulness and how it's taught on the treks
13:06 – The activities that might take place during the retreat / Where mindfulness moments might be found
13:47 – The benefits and effects of mindfulness for young cancer survivors
15:57 – The effect mindfulness plays in all unpleasant areas of life, cancer and non-cancer related
18:02 – What does "sitting with something" mean / 'The uninvited guest'
20:26 – The outcomes that True North Treks have been able to improve
22:51 – The extent to which the benefits of mindfulness and yoga taught on the trips last
23:52 – The ways that True North Treks facilitates the continued support group after the treks
25:03 – The extent to which participants carry out the practices on their own after the retreat
26:10 – COVID's long-term effects on the organization
28:55 – True North Treks' approach to funding and fundraising
33:20 – David shares a lesson he learned working with a non-profit organization
35:34 – A book he likes to give as a gift to friends and colleagues
38:10 – Favorite Rumi quotes shared

Episode Links
True North Treks
Donate to True North Treks
Trek Participant Application
Become a Volunteer at TNT
David's LinkedIn
David's Twitter

Connect with James
Subscribe to YouTube channel
LinkedIn
Twitter

Please reach out with comments and questions. Thanks! The post Young Adults With Cancer Learn About Mindfulness & Connection During Nature Treks with David Victorson appeared first on James Pann.
Children and education 3 years
40:54

The CIPP Evaluation Model with Guili Zhang

Episode in Data Talks
James Pann interviews Guili Zhang about the Context, Input, Process, and Product (CIPP) evaluation model and other evaluation-related areas. Dr. Zhang is Department Chair and Professor of Research and Evaluation at East Carolina University. She received a Ph.D. in Research and Evaluation Methodology from the University of Florida and postdoctoral advanced training in large-scale data analysis from Stanford University. She has presented and published extensively, and led the evaluation of many programs and projects funded by agencies such as the U.S. Department of Education and the National Science Foundation. Guili's book, The CIPP Evaluation Model, coauthored with Daniel Stufflebeam, is the authoritative book on the CIPP Model, one of the most influential and widely used evaluation frameworks. Dr. Zhang is very active in the American Evaluation Association and is currently a Board of Directors Member-at-Large.

Timeline
00:00 – Introduction
00:18 – How she got involved with the CIPP model
01:43 – When she sent her evaluation report that used the CIPP model to Dan Stufflebeam
02:38 – Dan asks Guili to write a book about the CIPP model with him
05:53 – Collaborating on the writing of the book at a distance
06:46 – Guili's concise explanation of the CIPP model
09:01 – CIPP model as an effective way to teach about evaluation in general
10:30 – Unique advantages of using the CIPP model
12:20 – A common-sense approach to evaluation that can be used by many
13:21 – CIPP model as a living, evolving framework
14:44 – Updated CIPP model related checklists linked to in the book
15:56 – What non-evaluation students can bring to their future work by learning evaluation
18:22 – The best way to learn how to do evaluation
19:35 – Evaluation resources she suggests
21:50 – How evaluation can be used to improve our world
23:41 – What Guili would like to accomplish while an AEA board member
25:28 – Books she likes to give as a gift to friends and colleagues

Episode Links
The CIPP Evaluation Model: How to Evaluate for Improvement and Accountability
Western Michigan University, The Evaluation Center
American Evaluation Association
American Family Education Institute
Guili's LinkedIn
Guili's Twitter

Connect with James
Subscribe to YouTube channel
LinkedIn
Twitter

Please reach out with comments and questions. Thanks! The post The CIPP Evaluation Model with Guili Zhang appeared first on James Pann.
Children and education 4 years
31:55

What’s the difference between research and evaluation? with Dana Wanzer

Episode in Data Talks
The difference between research and evaluation, the pros and cons of professionalization, the definition of evaluation, and other evaluation-related topics with Dana Wanzer, Ph.D., interviewed by James Pann, Ph.D. Dana is an assistant professor of psychology in evaluation in the psychology department at the University of Wisconsin-Stout. Dana teaches evaluation courses to students in the MS in Applied Psychology program, as well as statistics and intro psychology. Her research focuses on the evaluation profession, including defining evaluation, data visualization practices in evaluation, the role of politics in evaluation, and more.

"In one regard, that I think our research is pointing towards is, we need to professionalize to better communicate to others outside of our field, who we are, what we do, and how we are supposed to do this work. Because if we don't have those professional boundaries, then I mean, and we see this all the time, then funders get to dictate how evaluation is done. And it doesn't always align with our set of competencies, our ethical guiding principles, the personal frameworks and approaches that we use in evaluation, they'll just say, no, this is what we expect. And that may not align with what you want to do as an evaluator or what you think is maybe even ethical as an evaluator."

Outline
00:00 – Dana's study on the difference between research and evaluation as perceived by evaluators and educational researchers
01:01 – Why she decided to study this topic
04:10 – The methodology of her research
07:19 – What she found in her study
10:26 – Professionalization in the field of evaluation
12:14 – Common misunderstandings about what evaluation is and the impact
16:32 – Dana's explanation of evaluation depends on who she is talking to
20:30 – The benefit of evaluators having subject matter expertise related to the evaluand
21:25 – Agenda differences between evaluators and researchers
24:53 – Do we need a better word than "evaluation" to describe what we do?
28:42 – How Dana would convince a researcher that evaluation is different from research
30:44 – Evaluation theories she uses
33:19 – Social science theories she utilizes
36:22 – What she recommends to students who want to learn about evaluation
38:45 – How she hopes evaluation will benefit students who are not primarily focused on evaluation
42:08 – How mindfulness can support evaluation education and practice
48:49 – Resources for beginning evaluators
53:38 – How to connect with Dana

Episode Links
Dana's website
Dana's Twitter
Dana's LinkedIn
Evaluland Podcast
Dana's October 2020 American Journal of Evaluation article

Podcast info
Podcast website
Apple Podcast

Connect with James
Subscribe to YouTube channel
LinkedIn
Twitter

Please reach out with comments and questions. Thanks! The post What's the difference between research and evaluation? with Dana Wanzer appeared first on James Pann.
Children and education 4 years
56:27

The Importance of How We Define Evaluation with Amy Gullickson

Episode in Data Talks
In this interview, I speak with Amy Gullickson, acting Co-Director and Senior Lecturer at the Centre for Program Evaluation at the Melbourne Graduate School of Education. She is also Chair of the International Society for Evaluation Education. We talk about the best definition of evaluation and why it is important to have a clear definition. Amy also gives us some of her specific resources for people just starting to learn evaluation. Listen to the podcast episode here:

What is Amy's definition of evaluation?
Amy explained that it's important for us to think about the implications of the definition. She does that in detail in her article titled "The Whole Elephant: Defining Evaluation." She indicates that evaluation is the generation of a credible and systematic determination of merit, worth, and/or significance of an object through the application of defensible criteria and standards to demonstrably relevant empirical facts. Amy states that it is the implications of the definitions that are important: it's worth exploring what you (or your clients, or stakeholders) think evaluation is. That will shape what they expect you to deliver, and what may or may not be appropriate. Amy believes a definition of evaluation must include valuation. This is our task as evaluators and has been overshadowed by social science research. We've got much work to do to become as informed (and have as much empirical evidence about what good looks like) in our valuation practice as we are in our research practice.

Why Amy thinks it's important to have a clear definition of evaluation
People often think evaluation and research are the same thing. Amy talks with me about why it is important to understand the difference and have a clear definition. Amy gives an example: research asks, if you are trying to find the value of p (the probability of a Type I error), how big was the change? But evaluation asks, "so what?" Did it actually reach the people that are most important? Was it big enough to make a difference? Does that p value actually mean anything? The task defines the knowledge, skills, and attributes that are necessary to accomplish it. If evaluation is just applied social science, then there's no need to have skills and knowledge related to valuation. Amy thinks this is a significant flaw in common evaluation practice. You might not get to summative judgment every time (and for good reasons: it might not be appropriate to do so), but if we take the valuation process out of the definition, then we are allowing the implicit values of the most powerful to determine what good is, what evidence is. Then we become complicit in upholding systems that oppress the global majority, in effect giving our blessing to programs and systems that actually create harm. Amy explains this is exactly counter to what most people say they aspire to when they engage in evaluation.

How Amy believes evaluator competencies relate to how someone might define evaluation
Most competency sets have more than 60 competencies (Amy tells us the Australian Evaluation Society has 94). Canada has decided that anyone who can demonstrate an acceptable level of skill in each domain can be credentialed as an evaluator. But are all competencies equally important? Are all competencies equally unique to evaluation? What we emphasize in training or educating evaluators rests on the competencies that are essential to its practice. What makes an evaluation different from social science research? Valuation. What makes evaluators different from researchers, or general organizational consultants? The ability to provide explicit, clear, evidenced reasoning for valuation claims.

Amy's thoughts on evaluation being a profession that requires licensing or credentials to practice
Amy believes the profession of evaluation should require a license and credentials, but that will be a long time coming to the US. Voluntary Organizations of Professional Evaluators (VOPEs) in smaller countries will do it first (e.g., Canada). Evaluation is at the top end of the cognitive taxonomies (Bloom's, SOLO), and it is deeply and intrinsically political. But because everyone does it to get dressed in the morning, we seem to think anyone can do it in their professional life. This seems fundamentally foolish and also increases the possibility that evaluation will do harm.

What students can bring to their future work of evaluation
Amy indicated that criteria are always present in the collection of evidence and in decision making. Be the person who asks questions about what makes the organization, program, project, or system you're working on or in good, and how you would know that (with what evidence). Learn how to recognize good evaluation and advocate for it in your workplace and community (i.e., know whether you need research and/or valuation; don't conflate the two). Be an educated commissioner or consumer.

Lessons that inform your evaluation work
What we need to be looking at is: what are the lessons from other disciplines that we need to improve our work? Evaluation is a trans-discipline. We have a contribution to make to all the other disciplines in their pursuit of good, and the skills and knowledge they've developed can contribute to ours. Look at the competency sets: they cover all kinds of skills and knowledge that other disciplines have been studying and training people to do for years. Why would we stick with only social science research methods?

Improving the world with evaluation: tackling fake news, climate change, and social inequities
Amy pointed to the importance of being explicit about what makes something a something, and what makes that something good, and asking questions about it. Genuine, naive curiosity in pursuit of answers to those questions can be powerful. Take the pro-life movement in the US. What is meant by a pro-life candidate? That they don't support abortion. A good candidate is one that believes every fetus should live. This implies that life = fetus. As many critics have discussed, if we assume a broader definition of "life," it would probably encompass all human and non-human life. For humans, that would mean being "pro" things like health care (including mental health care), a living wage, decent housing, and protection from harm within or from the systems of government. For the planet, that would mean restoring habitat and natural systems like marshes, cutting fossil fuel dependency, and generally all the actions needed to reverse climate change. COVID-19 has kick-started us down that path by reducing pollution; can we maintain it? Good evaluation asks us to check our definitions: 1) What are the boundaries we are setting around a thing (like "life": what counts as life and what doesn't)? 2) What are the definitions of good for that thing? Look for values and criteria (e.g., how would we know that "life" is good for everyone?).

Mindfulness and Evaluation
Amy has found that meditation is a good way to take a research stance on her own functioning, get less caught up in the monkey mind, and keep a clearer head. You've got to know your default functioning to have any chance of avoiding it, and to use it to your advantage.

Amy's recommendations for people just starting to learn evaluation
– Favorite evaluation authors, both practical, plain-speaking writers who understand valuation, reasoning, and research methods: Jane Davidson (https://realevaluation.com/jane-davidson/) and Wolfgang Beywl (https://www.fhnw.ch/en/people/prof-phd-wolfgang-beywl); grab them (use DeepL to translate: https://www.deepl.com/translator)
– Read the New Zealand evaluation journal, Evaluation Matters (https://www.nzcer.org.nz/nzcerpress/evaluation-matters); it's free, and they've been working on the valuation task for longer than anyone else in English-speaking countries (because Jane used to live there)
– Pick up Mathea Roorda's handbook on generating defensible criteria. Download it here: https://melbourneuni.au1.qualtrics.com/jfe/form/SV_6QokokiqGHEvjNz
– Check out Better Evaluation: https://www.betterevaluation.org/
– Look for resources by EvaluATE: https://www.evalu-ate.org/
– Do your origin story (see the ISEE video from Vidhya Shanker: https://vimeo.com/476085190). Understand where you come from, the values you grew up with, and those instilled in you by your life experience and your studies. Unless you can see them, you will be implicitly governed by them. Along with that, you could use Bowen Family Systems Theory to explore your family history: https://thebowencenter.org/. The multi-generational family you come from is part of your origin story (even if you think it isn't!)
– Join your local evaluation association. And join AEA (https://eval.org); the student rates are cheap and they have heaps of resources
– Create a community of critical evaluation friends
– Volunteer to work on evaluation projects to get some experience
– Special issues of journals related to evaluation education and values in evaluation: Evaluation and Program Planning (https://www.sciencedirect.com/journal/evaluation-and-program-planning/special-issue/10RJ5T2S1BK); the Canadian Journal of Program Evaluation has a special issue on evaluation education in print at the moment; Evaluation Journal of Australasia did two special issues in the last 12 months (https://journals.sagepub.com/toc/evja/19/4; https://journals.sagepub.com/toc/evja/20/2)

Connect with Amy
Amy Gullickson on LinkedIn
Twitter: @amyg4ce
amy.gullickson@unimelb.edu.au

The post The Importance of How We Define Evaluation with Amy Gullickson appeared first on James Pann.
Children and education 5 years
01:10:53

Evaluative Thinking with Thomas Archibald

Episode in Data Talks
Thomas Archibald, Ph.D., is an Associate Professor and Extension Specialist in the Department of Agricultural, Leadership, and Community Education at Virginia Tech. His practice and research are focused primarily on program evaluation and evaluation capacity building. He serves as the Chief of Party/Director of the Feed the Future Senegal Youth in Agriculture project, funded by USAID/Senegal, which is increasing youth engagement in Senegal's economic growth. Tom is also an Associate Editor for Evaluation and Program Planning, Editorial Board Member of New Directions for Evaluation, Editorial Board Member of the American Journal of Evaluation, Board Member of the Eastern Evaluation Research Society (EERS), and Program Co-Chair of the American Evaluation Association (AEA) Organizational Learning and Evaluation Capacity Building Topical Interest Group.

We discuss the following areas:
– Evaluative thinking and how it relates to evaluation capacity building
– How valuing is an essential part of evaluative thinking
– What evaluation can teach organizations not traditionally served by evaluation, and what evaluators can learn from them
– Specific steps for evaluators to become more reflective in their practice
– The importance of evaluative thinking, critical thinking, practical wisdom, reflective practice, intuition, and mindfulness
– Social science theories that inform his work
– How theories of power and an understanding of power dynamics can inform evaluation
– How the study of evaluative thinking and evaluation can assist individuals in a broad range of disciplines
– Can evaluation save the world, or does it at least have a role in improving it?
– The future of evaluation: new and emerging approaches
– Books that Tom highly recommends

You can connect with Tom on LinkedIn or @tgarchibald on Twitter. Enjoy! The post Evaluative Thinking with Thomas Archibald appeared first on James Pann.
Children and education 5 years
57:47

Evaluation’s Role in Guiding Public Policy and Government with Rakesh Mohan

Episode in Data Talks
Rakesh Mohan has been the director of the Office of Performance Evaluations (OPE), an independent and nonpartisan agency of the Idaho State Legislature, since 2002. We cover important issues related to evaluation and how evaluation can guide government and public policy.

Under Rakesh's direction, the office has received many awards, including the American Evaluation Association's 2016 Outstanding Evaluation Award and the 2011 Alva and Gunnar Myrdal Government Evaluation Award. He is also the recipient of the 2016 Donald and Alice Stone Outstanding Practitioner Award from the American Society for Public Administration (ASPA). Recently he wrote a chapter for the book Evaluation Failures, published by Sage (2019), which we discussed previously in the interview with Kylie Hutchinson. His chapter was titled "I Didn't Know I Would Be a Tightrope Walker Someday: Balancing Evaluator Responsiveness and Independence."

He has also served on the American Evaluation Association's board of directors, the editorial advisory board of New Directions for Evaluation, and the US Comptroller General's Advisory Council on Government Auditing Standards.

Here are some of the topics we cover:
– How Rakesh's office provides guidance to the Idaho State Legislature
– Keeping the primary stakeholders in mind is essential to conducting relevant evaluations and effective reports. Here's an example of a one-page evaluation summary from his office: Child welfare system: Reducing the risk of adverse outcomes https://legislature.idaho.gov/ope/rep…
– What he suggests to someone who wants to work in government-related evaluation
– The main differences between his evaluation work and what most external evaluators do
– Rakesh discusses an evaluation-related mistake he made and what he learned from it
– How he and his team reduce bias and improve clear thinking in their work

Some of his favorite evaluation resources include:
– BetterEvaluation: https://www.betterevaluation.org/
– Eleanor Chelimsky interview: https://www.youtube.com/watch?v=pF34R…
– Eastern Evaluation Research Society: http://eers.org/
– Pew Research Center: https://www.pewresearch.org/
– US Government Accountability Office (GAO): https://www.gao.gov/

Thanks and enjoy! The post Evaluation's Role in Guiding Public Policy and Government with Rakesh Mohan appeared first on James Pann.
Children and education 5 years
52:06

Ending Dusty Shelf Reports with Ann K. Emery

Episode in Data Talks
Ann K. Emery at Depict Data Studio helps people and organizations visualize data more effectively. This interview covers key evaluation and research reporting strategies, techniques, and resources. Ann is well known in the evaluation world and beyond. She has been invited to speak in more than 30 states and 10 countries, more than 3,200 people have enrolled in her online training academy, and she has consulted for more than 150 organizations, including the United Nations, the Centers for Disease Control and Prevention, and Harvard University.

In this discussion, we talk about:
– Why Ann loves data visualization and reporting
– Reasons evaluators should be concerned about data visualization and reporting
– Resources someone just starting out in evaluation (and others) can use to get proficient in data visualization, including Ann's free course, Soar Beyond the Dusty Shelf Report
– Using a DataViz Wall of Fame
– Techniques for engaging project stakeholders in the reporting process. She co-authored an article on using Data Placemats that is available here
– Common mistakes she sees evaluators make with their reports
– How Ann obtains helpful feedback regarding her work
– Her inspiration for developing blog posts
– Books that Ann recommends
– How she structures her day/week to enhance productivity

Enjoy! The post Ending Dusty Shelf Reports with Ann K. Emery appeared first on James Pann.
Children and education 5 years
01:10:34

Learning from Evaluation Mistakes with Kylie Hutchinson

Episode in Data Talks
Kylie is the 2020 recipient of the Canadian Evaluation Society's Award for Contribution to Evaluation in Canada. She has spent the past thirty years working in the not-for-profit sector as a consultant, trainer, program manager, board member, and volunteer. There are many valuable nuggets from this video. We discuss the following during the interview:
– Reasons why Kylie loves evaluation
– Why evaluation is relevant for health and human services students who are not necessarily interested in research or evaluation
– Why she wrote Evaluation Failures: 22 Tales of Mistakes Made and Lessons Learned. She highlights some of the most important lessons from the book
– Kylie talks about how everyone makes mistakes and the key is learning from them
– She reviews specific strategies for engaging project stakeholders in an evaluation
– Kylie provides her favorite resources regarding program design
– How mindfulness factors into her work and how it can inform the work of evaluators
– Books she recommends

You can connect with Kylie through https://communitysolutions.ca or @EvaluationMaven on Twitter. Watch the video recording of our interview here: The post Learning from Evaluation Mistakes with Kylie Hutchinson appeared first on James Pann.
Children and education 5 years
33:51

Holly Rustick on Grant Writing Lessons Learned

Episode in Data Talks
James Pann, Ph.D., interviews Holly Rustick to get her insights into grant writing and management from her years of experience doing this work, her work training others to do it, and her podcast interviews. We discuss:
– Common themes or lessons she has learned from the 120+ podcast episodes completed
– Suggested first steps in writing a grant proposal
– Frequent mistakes that grant writers make
– Low-hanging fruit (easy wins) in the proposal development process
– Good resources for those starting to write grant proposals
– How an organization should go about finding a grant writer

Thanks and enjoy! Watch the video recording of our interview here: The post Holly Rustick on Grant Writing Lessons Learned appeared first on James Pann.
Children and education 5 years
43:01

Larry Martin, Ph.D. on Designing, Managing and Evaluating Programs

Episode in Data Talks
James Pann, Ph.D., interviews Larry Martin, Ph.D., co-author of Designing & Managing Programs: An Effectiveness-Based Approach and a professor at the University of Central Florida, about program design and evaluation. OUTLINE: 00:00 – Introduction 00:22 – Background and Larry’s interest in program design and management 03:13 – Reasoning behind publishing the book 05:57 – How this book can be helpful for Master’s students in health and human services for their future careers 08:29 – Approaches to learning the material covered in the text 14:24 – Challenges and struggles students might encounter with the book and material 18:08 – Suggested resources regarding program design and management PODCAST LINKS: Designing & Managing Programs: An Effectiveness-Based Approach Larry’s webpage at the UCF College of Community Innovation and Education CONNECT WITH JAMES: Subscribe to this YouTube channel LinkedIn Twitter Please reach out with comments and feedback. Thanks!
Children and education 5 years
33:49

Utilization-Focused Evaluation with Michael Quinn Patton

Episode in Data Talks
In this episode, I interview Michael Quinn Patton, Ph.D., on Utilization-Focused Evaluation. “What don’t you know, that if you did know, would make a difference to what you do?” Michael Quinn Patton is the Founder and CEO of Utilization-Focused Evaluation, an independent organizational development and program evaluation organization. He has authored numerous books on evaluation, including Blue Marble Evaluation (2019), Principles-Focused Evaluation (2018), Facilitating Evaluation (2018), Developmental Evaluation (2010), and Utilization-Focused Evaluation (2008). During the interview, Michael answers the following questions and addresses other areas. OUTLINE: 00:00 – Introduction 02:10 – Definition of Utilization-Focused Evaluation (U-FE) 04:11 – How developmental evaluation, principles-focused evaluation, and blue marble evaluation fit in with U-FE 07:44 – What’s emphasized in U-FE that is not in other evaluation approaches 10:03 – How U-FE can be used in conjunction with other evaluation models 11:49 – Specific strategies and tactics for engaging project stakeholders in an evaluation 15:00 – Lessons from the social sciences research literature that inform Michael’s evaluation work 17:35 – Common things that evaluators think they already know, but often don’t 18:45 – The best way to really learn how to do evaluations, with recommendations for students and beginning evaluators 23:36 – The relationship between mindfulness and evaluation 26:52 – Resources for becoming familiar with U-FE 28:13 – Books that Michael likes to give as gifts 31:04 – How to connect with Michael CONNECT WITH JAMES: – Subscribe to this YouTube channel – LinkedIn: https://www.linkedin.com/in/pannjames/ – Twitter: https://twitter.com/jpann Please reach out with comments and feedback! Thanks! Watch the video recording of our interview here:
Children and education 5 years
34:51

Empowerment Evaluation with David Fetterman, Ph.D.

Episode in Data Talks
In this episode, I interview David Fetterman about Empowerment Evaluation. David Fetterman introduced…
Children and education 5 years
47:56

Purpose-Driven Data Visualization with Alberto Cairo

Episode in Data Talks
Alberto Cairo, Ph.D., interviewed by James Pann, Ph.D. Alberto responds to the following…
Children and education 5 years
37:33

Human-Centered Design Thinking Approach to Survey Development and Use with Sheila Robinson

Episode in Data Talks
Interview with Sheila Robinson, Ed.D., about survey development and use, by James Pann, Ph.D. …
Children and education 5 years
33:52

Data Talks with James Pann, Episode 4: Sameet Kumar - Data Talks

Episode in Data Talks
Data Talks with James Pann, Episode 4 features Sameet Kumar, Ph.D., a psychologist at the Memorial Cancer Institute, serving patients of the Memorial Healthcare System in south Broward County. We focus on Sameet’s use of data to make professional and personal decisions. Sameet has specialized in working with adults who have cancer, as well as their caregivers and families, for over 15 years. In addition to being a clinical psychologist, he has studied with numerous Hindu and Buddhist teachers as part of his training. His professional interests include mindfulness meditation, resilience, well-being, and grief and bereavement. Sameet is the author of the bestselling Grieving Mindfully: A Compassionate and Spiritual Approach to Coping with Loss and The Mindful Path Through Worry and Rumination, which is available in several languages. His most recent book is Mindfulness for Prolonged Grief. He can be followed on Twitter at @sameetkumar, on his Facebook page Sameet Kumar, Ph.D., and reached at dr.sameetkumar@gmail.com. The post Data Talks with James Pann, Episode 4: Sameet Kumar appeared first on EvalNetwork.
Children and education 10 years
46:26