
Harnessing Generative AI’s Potential in Extension while Ensuring Ethical Use

Publication Number: P4112

Introduction

Generative artificial intelligence (AI) tools, such as Microsoft Copilot, ChatGPT, and Perplexity, create text, video, audio, and image content that resembles human-created work. Generative AI has the potential to increase efficiency, accessibility, and creativity in Extension work, particularly in planning and evaluating Extension programs, but it also poses ethical challenges for Extension professionals, such as bias, inaccurate information, and plagiarism. For this reason, Extension professionals must use generative AI tools ethically and responsibly to maintain integrity while harnessing their potential. This publication highlights the benefits, ethical issues, and best practices for using generative AI tools in Extension.

Ways Generative AI Can Benefit Extension

Efficient Extension service

Extension professionals can use generative AI tools to draft program materials, generate lesson plans, and summarize research findings much faster than they could manually. This saves time that can then be invested in tasks requiring a physical presence, such as engaging with communities and delivering Extension programs.

Increased access to Extension programs

By using generative AI tools like Perplexity to translate text and convert it to audio, Extension professionals can prepare program materials in multiple formats and languages that serve diverse audiences. For instance, Extension professionals can use generative AI to translate a program on composting and convert it into an audio version for clients, including individuals who are visually impaired or have low literacy levels. This allows more clients to access the resources, enhancing Extension program participation and reach.
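As a minimal illustration of the translation step, the sketch below uses the OpenAI Python library to translate a short composting tip into Spanish. The model name, prompt, and sample text are assumptions for illustration only; any comparable tool could be used, and a bilingual reviewer should check the output before it is shared with clients.

    # Minimal sketch (assumed setup): translate a composting tip into Spanish
    # with the OpenAI Python library. Requires an API key in the OPENAI_API_KEY
    # environment variable.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the environment

    composting_tip = (
        "Turn your compost pile every one to two weeks "
        "to keep it aerated and speed up decomposition."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; substitute your own
        messages=[
            {"role": "system",
             "content": "Translate the user's text into plain-language Spanish."},
            {"role": "user", "content": composting_tip},
        ],
    )

    print(response.choices[0].message.content)  # review before sharing

The translated text could then be converted to audio with a text-to-speech tool to serve clients who are visually impaired or have low literacy levels.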

Creative Extension program delivery

Generative AI tools can assist Extension professionals in brainstorming new program ideas and preparing more engaging activities that hold participants' interest and attention throughout program delivery. For example, Extension professionals can use generative AI to identify ideas for new youth STEM education programs and hands-on activities that engage participants, enhancing their creativity in their work.

Ways Generative AI Can Pose Problems to Extension

Generative AI can leak confidential Extension information.

Generative AI can inadvertently leak confidential Extension information. When Extension professionals input confidential information, such as unpublished Extension program evaluation data, into public generative AI tools, that data can become part of the tool's training data, which other users may be able to access. For example, if a professional inputs unpublished evaluation data into ChatGPT for analysis, the data becomes exposed and may be accessed by other users.

Extension professionals should therefore treat all data or prompts they input into public generative AI tools as public information. In contrast, Microsoft Copilot integrates with Microsoft 365 and follows the organization's privacy, compliance, and security policies.

Generative AI content can be inaccurate and unreliable.

Sometimes generative AI generates content that is untrue (not based on fact) or does not make sense, which can mislead professionals. Imagine what would happen if Extension professionals used generative AI to generate fact sheets containing inaccurate information about specific agricultural practices and shared that information with farmers! Professionals would misinform the farmers who rely on them, leading to incorrect farm practices that cause low productivity and financial loss. As a result, Extension programs could lose credibility and fail to make a positive impact.

There may be issues of plagiarism and academic integrity.

Plagiarism is the practice of taking the ideas or work of others and presenting them as one's own without acknowledging the original authors. Academic integrity, on the other hand, means being fair, honest, and respectful of others' work and ideas.

Some generative AI tools produce content without acknowledging the sources of the information they draw on. If Extension professionals use such content for any purpose, such as planning programs or developing evaluation plans, without acknowledging those sources, or without acknowledging the generative AI tool that generated the content and how it helped them perform those tasks, this constitutes plagiarism and academic dishonesty.

Guidelines for Ethical Use of Generative AI by Extension Professionals

To use generative AI ethically, Extension professionals should follow these guidelines:

  • Transparency is key. Extension professionals must always disclose that they used generative AI and explain how it assisted them. If they use AI to draft an agriculture-related program proposal, for example, they should make this clear by stating in the proposal that "Portions of this document were generated with AI assistance and verified by agricultural experts to ensure accuracy."
  • Use expertise to review AI-generated content. Extension professionals must read and edit all content generated by generative AI to ensure it is accurate, relevant, and reliable. For example, if they use generative AI to analyze survey data, they must carefully review the results the tool produces to ensure accuracy.
  • Ensure data privacy. Extension professionals often collect sensitive information from participants, such as income, farm size, or household size, during program development or delivery, and they must protect this information. Extension professionals must follow strict data privacy guidelines whenever they input documents containing sensitive information into a generative AI tool for analysis or insight. For example, before inputting survey data into a generative AI tool, they should remove all personal or demographic information that could identify participants (see the sketch after this list). They should also comply with all relevant data privacy policies to protect clients and keep collected data safe and secure.
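As a minimal sketch of the de-identification step described in the last guideline, the example below uses the pandas library to drop columns that could identify participants before any data is shared with a generative AI tool. The file name and column names are hypothetical and would need to match the actual survey file.

    # Minimal sketch (assumed file and column names): remove identifying columns
    # from a survey export before pasting or uploading data to a generative AI tool.
    import pandas as pd

    # Hypothetical survey export; replace with the actual file.
    survey = pd.read_csv("program_survey.csv")

    # Columns that could identify participants (assumed names).
    identifying_columns = ["name", "email", "phone", "address", "farm_id"]

    # Keep only columns that are safe to share; errors="ignore" skips any
    # listed column that is not present in this particular file.
    de_identified = survey.drop(columns=identifying_columns, errors="ignore")

    de_identified.to_csv("program_survey_deidentified.csv", index=False)
    print(de_identified.head())  # spot-check before sharing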

How Extension Professionals Can Use Generative AI Responsibly

Responsible use of generative AI means approaching and using it in a way that maximizes its benefits, complies with ethical guidelines, and avoids negative consequences such as inaccuracy. To ensure responsible use, Extension professionals must:

  • Start small. Extension professionals are encouraged to use generative AI first for simple tasks like brainstorming ideas for curricula, planning a program, or summarizing short evaluation or research reports. This approach allows professionals to deepen their understanding of generative AI over time, preparing them for more complex and sensitive tasks such as conducting needs assessments and uploading documents to gain insights.
  • Seek training. Regular training helps Extension professionals develop the skills, knowledge, and attitudes needed to use generative AI tools effectively and responsibly. Workshops, webinars, and conferences are great sources of generative AI training.
  • Be ethical. Extension professionals should adhere to written guidelines for using generative AI so that they protect data privacy, ensure accuracy, maintain academic integrity, and avoid plagiarism. This will help ensure integrity in Extension work. A good source of ethical guidelines for MSU Extension professionals is the Generative AI Guidelines for MSU.
  • Be up to date. Because generative AI evolves rapidly, with new features improving on previous versions, Extension professionals should periodically seek new information about generative AI from trusted sources like university websites, MSU Extension's website, and journal articles. This ensures that professionals know the latest features, best practices and guidelines, and new skills required to use generative AI effectively and responsibly.

Conclusion

While generative AI has great potential to enhance efficiency, accessibility, and creativity in Extension generally, and in Extension program planning and evaluation in particular, unethical and irresponsible use of generative AI could harm the accuracy and integrity of the Cooperative Extension Service.

Therefore, Extension professionals must adhere to ethical guidelines, use best practices, seek new information about generative AI to stay up to date, adapt to changing features of generative AI, and prioritize human oversight to minimize the risk of irresponsible and unethical use of generative AI and the resulting consequences.

For more information, please refer to MSU Extension Publication 4061, Integrating Generative AI Tools into Extension Program Planning and Evaluation, online at extension.msstate.edu, and the Generative AI Guidelines for MSU.

References

AlAli, R., Wardat, Y., Al-Saud, K., & Alhayek, K. A. (2024). Generative AI in education: Best practices for successful implementation. International Journal of Religion, 5(9), 1016–1025.

Alier, M., García-Peñalvo, F., & Camba, J. D. (2024). Generative artificial intelligence in education: From deceptive to disruptive. International Journal of Interactive Multimedia and Artificial Intelligence.

Baltà‐Salvador, R., El‐Madafri, I., Brasó‐Vives, E., & Peña, M. (2025). Empowering engineering students through artificial intelligence (AI): Blended human–AI creative ideation processes with ChatGPT. Computer Applications in Engineering Education, 33(1).

Beltran, M. A., Ruiz Mondragon, M. I., & Han, S. H. (2024). Comparative analysis of generative AI risks in the public sector. Proceedings of the 25th Annual International Conference on Digital Government Research, 610–617.

Bolender, B., Vispoel, S., Converse, G., Koprowicz, N., Song, D., & Osaro, S. (2024). Generative AI in K12: Analytics from early adoption. Journal of Measurement and Evaluation in Education and Psychology, 15 (Special Issue), 361–377.

Hadi, M. U., Tashi, Q. A., Qureshi, R., Shah, A., Muneer, A., Irfan, M., Zafar, A., Shaikh, M. B., Akhtar, N., Wu, J., & Mirjalili, S. (n.d.). A survey on large language models: Applications, challenges, limitations, and practical usage. TechRxiv. 

Ijiga, A. C., Peace, A. E., Idoko, I. P., Agbo, D. O., Harry, K. D., Ezebuka, C. I., & Ukatu, I. E. (2024). Ethical considerations in implementing generative AI for healthcare supply chain optimization: A cross-country analysis across India, the United Kingdom, and the United States of America. International Journal of Biological and Pharmaceutical Sciences Archive, 7(01), 48–63.

Jamieson, S., & Howard, R. M. (2019). Rethinking the relationship between plagiarism and academic integrity. Revue Internationale Des Technologies En Pédagogie Universitaire, 16(2), 69–85.

Kamel, H. (2024). Understanding the impact of AI hallucinations on the university community. Cybrarians Journal, 73, 111–134. 

Mensah, E. A., & Osman, N. (2024). Integrating generative AI tools into Extension program planning and evaluation (Publication 4061). Mississippi State University Extension Service.

OpenAI. (2025). ChatGPT (Version 4.0) [Large Language Model].


The information given here is for educational purposes only. References to commercial products, trade names, or suppliers are made with the understanding that no endorsement is implied and that no discrimination against other products or suppliers is intended.

Publication 4112 (POD-04-25)

By Emmanuel Anobir Mensah, Graduate Student, Agricultural Science, and Nesma Osman, PhD, Assistant Professor, Human Sciences.
