Recommended AI Tool: Copilot-Chat
The official AI tool at UCF is Copilot-Chat (also known as Copilot-Web). Microsoft also uses the term “Copilot” for the AI embedded in MS Office products, but the web-based chat tool is a separate product.
Copilot with Commercial Protection is NOT the same thing as “Copilot,” which is the public version of Microsoft’s LLM, also available on the web. Copilot with Commercial Protection (when you are logged in with a UCF NID) is a “walled garden” for UCF that offers several benefits:
- It searches the current Internet and is not limited to a fixed point in time when it was trained
- It uses GPT-4 (faster, better) without having to pay a premium
- It uses DALL-E 3.0 to generate images (right there inside Copilot rather than on a different site)
- It provides a live Internet link to verify the information and confirm there was no hallucination
- Faculty, staff, and students log in with their NID
- Data stays local and is NOT uploaded to Microsoft or to the public version of Copilot. Inputs into Copilot with Commercial Protection are NOT added to the system’s memory, database, or future answers. However, per UCF retention policies, a local copy is kept in a hidden (local) mailbox folder in your own SharePoint account for you to view or delete.
The safe version of UCF’s Copilot is accessed via this procedure:
- Start at the Copilot greeting page.
- If the site doesn’t recognize your UCF email, switch to the Edge browser.
- Click “sign in” at the top-right
- Select “work or school” for the type of account
- Type your full UCF email (including @ucf.edu) and click NEXT
- Log in with your NID and NID password. (Note: you may need to alter your SafeSearch settings away from “Strict”)
- You should see a checkmark in a green shield near the top. This is how you know you are in “data protection” mode.
Other AI Tools
The lists below are organized in a modified stoplight structure to indicate which products are safe (green), which require caution (yellow), and which cannot be used (red).
Green
Products in this category have been deemed safe by UCF IT, and may be used without reservation. Some are websites that search research databases, but do not pose a threat to UCF sensitive data. Most in this list (but not all) can be used for free.
- Adobe Firefly – this image generation AI is included for UCF faculty and staff through our Adobe Creative Cloud subscription. Students are not included in this subscription.
- Elicit – the freemium AI service helps you find relevant academic papers and provides summaries.
- Intelligent Course Search – Built in-house at UCF, ICS ingests all the information from published items in a Webcourses class, and allows students to query it for information. This would be especially useful if they want to pinpoint something they remember reading.
- Microsoft Office (Word, Excel, Outlook, PowerPoint) – Increasingly there are AI-assisted functions within MS Office products. These are all greenlit for use by UCF IT.
- Microsoft Teams AI – Microsoft Teams uses AI to provide transcripts of meetings. UCF IT has authorized this functionality since the data does not leave the UCF ecosystem.
- Research Rabbit – a standalone AI-powered website, Research Rabbit provides reliable network maps to quickly discover the seminal (and related) studies to any given academic topic.
- Scite – this website answers questions with citations and links, and helps researchers verify that published studies have not been retracted. There is no free version.
Note: students should not interpret “green” as permission to use these tools in every class. The color designation means that UCF has approved the technology behind these tools, but your instructor may not want them used in a particular class. Continue to consult the syllabus for specific permissions or restrictions.
Light Green
Microsoft uses the term “Copilot” to refer to several other products besides Copilot Chat. These additional tools are not free for UCF users, and must be paid for individually or by departments. However, they have been vetted and approved by UCF IT, and could be installed if you are willing to pay for them. These are the two most common tools to consider:
- M365 Copilot
- A custom AI tool that is “trained” on everything in your OneDrive and Outlook emails. You can ask it to find patterns, detect “needles in a haystack,” or analyze your own documents.
- A Copilot tool embeds itself in each MS Office product. You can click the Copilot icon within PowerPoint, for example, and ask it to assemble a presentation based on a Word document from your OneDrive.
- $400/year
- Copilot Studio
- Build your own agents with your own custom training
- Can analyze materials within
Other considerations:
- Personal funds cannot be used to pay for either of the enhanced Copilot licenses. A valid UCF funding source is required to pay through the IT department.
- Both tools work only when you are logged in to Windows with your work account
- Both tools are available to faculty and staff, but not to students.
- Indicate your interest in purchasing either of these by opening a ServiceNow ticket.
Yellow
Products in this category are neither greenlit by UCF IT nor banned. UCF stakeholders can independently decide to use these products, in either their free or paid versions. However, none of these products may be given access to sensitive UCF data; this includes you (the user) pasting or uploading such data. You must also be cautious about uploading someone else’s published material: the copyright belongs to the original author, and uploading their work is akin to republishing their content on the open Internet.
The products in this list are not supported or paid for by UCF. If you have questions about possible future site licenses, email parker.snelson@ucf.edu
These products are mostly phone apps or websites. Some of them offer deeper integration with your computer, but this is only possible if you complete UCF IT’s security review.
- Apple Intelligence – generative AI embedded within recent MacOS and iOS software
- Canva – a freemium online image creating/editing tool
- ChatGPT – the most well-known LLM, ChatGPT has several models. ChatGPT 3.5 is completely free, with no login necessary, though free logins are also offered. ChatGPT 4 is available with ChatGPT-Plus ($20/month). There are different sub-models within the GPT-4 series; some can generate AI images. There is also a more expensive ChatGPT-Pro subscription ($200/month). It is owned by OpenAI.
- Claude – the text-generating AI created by Anthropic (founded by ex-employees of OpenAI). Like ChatGPT, it has different pricing plans (and a free version), and includes advanced capabilities, such as Computer Use via Claude 3.5 Sonnet, that belong in the Light Red category below.
- Consensus – the website provides an academic-only search engine, and includes both free and paid subscriptions.
- DALL-E – the image-generating AI created by OpenAI
- Deep Research – the name given to advanced research models by OpenAI and Google (separately). They are “targeted” (narrow) agents that identify and summarize research
- ExplainPaper – a freemium website, ExplainPaper allows users to upload a paper and highlight passages to obtain a simpler, jargon-free explanation of the concepts being discussed.
- Gemini – an LLM from Google (formerly known as Bard)
- Google Docs / Google Drive – their privacy policies allow third parties, including AI, to view (“scrape”) your data.
- Grammarly – this grammar-check software has both free and paid versions, and a browser extension. It may share data with third-party vendors.
- Grok – the generative AI product embedded within X (formerly Twitter)
- Llama – this LLM is Meta’s AI model.
- Midjourney – an image-generating website, Midjourney has no free plans, but is known for its realistic images.
- Napkin.ai – a freemium AI that transforms text into visuals such as charts, graphs, diagrams, flowcharts, and infographics.
- NotebookLM – this AI website from Google will analyze uploads of up to 50 PDFs, or URLs of YouTube videos, and generate a custom podcast to summarize the content. It also generates summaries and study guides.
- Perplexity – a privately held AI answer engine that is known for research queries, since it provides direct sources and quotations. It offers both free and paid plans.
- Sora – a text-to-video generative AI from OpenAI, included with ChatGPT-Plus.
- Stable Diffusion – an image generating AI that runs on your own computer (with significant technical know-how), or via paid services.
Light Red
In this category are “agents” that complete tasks autonomously within your computer or within your browser. They are sometimes characterized as personal assistants, performing tasks like ordering a meal from a restaurant delivery service. This potentially gives them access to sensitive UCF data, if they have access to your files or browser settings. To use any of these products, you will need to complete IT’s security review before purchasing the software.
- Amazon Nova Act – this agent will complete in-browser tasks, which may include giving it access to your UCF login credentials.
- ChatGPT Operator – this agent promises to perform many tasks on your computer autonomously. It will be available with ChatGPT-Pro ($200/month).
- Claude for Computer Use – this agent from Anthropic can perform many tasks on your computer autonomously. Claude 3.5 Sonnet is one such example.
- Manus – this agent from China has not been evaluated yet for use at UCF.
Red
Products in this category have not been broadly approved by IT for use at UCF, often for reasons of privacy and security. For any questions, visit Information Security.
- AdobeAI – the license costs money, and the company reserves the right to share data with unknown third-party vendors.
- Deepseek – this AI from China has been banned at public institutions in Florida by the state legislature.
- Otter.ai – this service transcribes virtual meetings such as Zoom, and shares info with third parties.
- Read.ai – this transcription service has been blocked at UCF because of its aggressive method of auto-adding additional users.
- ZoomAI – this feature of Zoom does not guarantee privacy and UCF’s IT department will not enable it.
Faculty Center’s Curated List of AI Tools
There are several repositories that attempt to catalog all AI tools (futurepedia.io and theresanaiforthat stand out in particular), but we’ve been curating a smaller, more targeted list here.
AI Fluency
We recommend that faculty approach the AI revolution with the recognition that AI is here to stay and will represent a needed skill in the workplace of the future (or even the present!). As such, both faculty and students need to develop AI Fluency skills, which we define as:
- Understanding how AI works – knowing how LLMs operate will help users calibrate how much they should (mis)trust the output.
- Deciding when to use AI (and when not to) – AI is just another tool. In some circumstances users will get better results than a web-based search engine, but in other circumstances the reverse may be true. There are also moments when it may be unethical to use AI without disclosing the help.
- Valuing AI – dispositional outcomes like this one are often overshadowed by the cognitive outcomes faculty tend to favor, yet true fluency with AI – especially the AI of the future – will require a favorable disposition toward using it. Thus, we owe it to students to recognize AI’s value.
- Applying effective prompt engineering methods – as the phrase goes, “garbage in, garbage out” applies when it comes to the kind of output AI creates. Good prompts give better results than lazy or ineffective prompts. Writing effective prompts is likely to remain a tool-specific skill, with different AI interfaces needing to be learned separately.
- Evaluating AI output – even today’s advanced AI tools can create hallucinations or contain factual mistakes. Employees in the workplace of the future – and thus our students today – need expertise in order to know how trustworthy the output is, and they need practice in fixing/finalizing the output, as this is surely how workplaces will use AI.
- Adding human value – things that can be automated by AI will, in fact, eventually become fully automated. But there will always be a need for human involvement for elements such as judgment, creativity, or emotional intelligence. Our students need to hone the skill of constantly seeking how humans add value to AI output. This includes sensing where (or when) the output could use human input, extrapolation, or interpretation, and then creating effective examples of them. Since this will be context-dependent, it’s not a single skill needed so much as a set of tools that enable our alumni to flourish alongside AI.
- Displaying digital adaptability – today’s AI tools will evolve, or may be replaced by completely different AI tools. Students and faculty need to be prepared for a lifetime of changing AI landscapes. They will need the mental dexterity and agility to accept these changes as inevitable, and the disposition to not fight against these tidal forces. The learning about AI, in other words, should be expected to last a lifetime.
Prompt Engineering
You can always just have a back-and-forth conversation with LLMs; they have a limited memory of what you’ve recently asked. However, it’s usually better to provide a detailed prompt to begin with.
You’ll want to approach this differently than you do Web search, where many people type a single phrase – almost a topic or a few nouns – rather than a full sentence.
A good prompt for an LLM is likely more than one full sentence. You want a lot of detail. Here are some ideas to consider:
- Persona – tell the LLM to role play as somebody
- Context – tell the LLM what you need overall, and why
- Task – tell the LLM exactly what you want
- Requirements – tell the LLM what the output should look like: format, length, level of sophistication, maybe even attitude
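As a sketch, the four elements above can be combined into a single, detailed prompt. The function and field names below are illustrative only, not part of any specific AI tool’s interface:

```python
def build_prompt(persona, context, task, requirements):
    """Assemble a detailed LLM prompt from the four elements:
    persona, context, task, and requirements. Field names are
    illustrative, not tied to any particular AI product."""
    return "\n\n".join([
        f"You are {persona}.",            # Persona: who the LLM should role-play
        f"Background: {context}",          # Context: what you need overall, and why
        f"Task: {task}",                   # Task: exactly what you want
        f"Requirements: {requirements}",   # Requirements: format, length, tone
    ])

prompt = build_prompt(
    persona="an experienced instructional designer",
    context="I teach an introductory biology course and want a low-stakes review activity.",
    task="Write five multiple-choice questions about cell division.",
    requirements="Each question has four options, one correct answer, and a one-sentence explanation.",
)
```

The resulting multi-paragraph prompt can then be pasted into any chat-based LLM; the point is simply that spelling out all four elements tends to produce better output than a single terse phrase.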
“60+ ChatGPT Assignments to Use in Your Classroom Today”
The Faculty Center staff assembled this open-source book to give you ideas about how to actually use AI in your assignments. It is free for anyone to use, and may be shared with others both inside and outside of UCF.
“50+ AI Hacks for Educators”
The Faculty Center staff wrote this second open-source book that provides ideas on how to use AI in your own life as a faculty member, including research and productivity. It is free for anyone to use, and may be shared with others both inside and outside of UCF.
AI Glossary
- Agents – a type of advanced AI that needs only be given a desired outcome, then it autonomously finds its own path and intermediate steps to accomplish the goal.
- AGI (Artificial General Intelligence) – a theoretical type of AI that has abilities indistinguishable from a human, and hypothetically can pass the Turing test.
- Chatbot – this term is sometimes used as a synonym for public LLMs, but can also refer to a custom-trained LLM created by an individual user rather than by a large company like OpenAI.
- GenAI (Generative AI) – a type of AI that “generates” an output, such as text or images. Large language models like ChatGPT are generative AI.
- Generative Inbreeding – this term refers to AI being trained on previously-generated AI output, which may contain biases and hallucinations. These incorrect facts may be considered correct if they are prevalent enough, and the model becomes less reliable over time.
- LLM (Large Language Model) – a type of generative AI trained on massive collections of text to predict the next logical word in a sentence, given the task/question it’s been given. Advanced models have excellent “perplexity” (plausibility in the word choice) and “burstiness” (variation in the sentences).
- Reasoning Models – advanced versions of LLMs that intentionally take longer to answer in order to provide better-reasoned answers. Ideal for knotty/thorny problems.
- Superintelligence – a hypothetical type of AI that far exceeds human intelligence.
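To make the “predict the next word” idea in the LLM entry above concrete, here is a toy sketch. The candidate words and scores are invented for illustration; a real model scores tens of thousands of tokens using a neural network rather than a hand-written list:

```python
import math

def softmax(scores):
    # Convert raw scores into probabilities that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy candidates for the next word after "The cat sat on the ..."
candidates = ["mat", "roof", "equation"]
raw_scores = [3.0, 1.5, -2.0]  # invented numbers for illustration

probs = softmax(raw_scores)
next_word = candidates[probs.index(max(probs))]  # pick the most probable word
```

In practice, models often sample from these probabilities rather than always taking the single most likely word, which is part of why the same prompt can produce different answers each time.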
Links
- Faculty Center’s page on teaching with AI and ideas for pedagogy.
- Aii. UCF’s Artificial Intelligence (Research) Initiative is an alliance of interdisciplinary AI researchers who are creating new knowledge in multiple branches of AI, including Computer Vision, Data Analytics, Machine Learning, Natural Language Processing (NLP), and Robotics.
- Office of Research CyberInfrastructure (ORCI). This office at UCF can help AI researchers connect to resources for high-performance computing, cloud services, data storage, APIs for LLMs or other AI tools, and other infrastructure-related questions associated with research. They offer consultations and workshops, as well, including some basic AI topics.
- IT’s list of official UCF software.