As of this month, UNC faculty and staff can access a generative artificial intelligence tool to summarize articles, generate coursework and accelerate their online research.
Information Technology Services made a version of Microsoft Copilot — formerly known as Bing Chat Enterprise — available to UNC employees on Nov. 8. On Nov. 15, Microsoft announced that Bing Chat and Bing Chat Enterprise had become Copilot.
While ITS still advises University employees to use caution when sharing information with any chatbot or AI tool, the University’s version of Microsoft Copilot is advertisement-free and does not store or view users’ chats.
“The absolute most important, main thing is that we’re providing access to the faculty and staff in a way that gives them a partition — that is, the institutional partition — that has some protections beyond just using a commercially-available free tool,” said Michael Barker, vice chancellor of ITS.
Unlike OpenAI's popular generative chatbot ChatGPT, Microsoft Copilot can connect to apps employees already use (such as Word, Excel and PowerPoint) and generate images based on text prompts.
Stan Ahalt, dean of the UNC School of Data Science and Society, said Microsoft Copilot is also more “grounded” than ChatGPT because it uses available, reliable data to constrain its responses and avoid factual errors.
Answers from Microsoft Copilot come with cited sources from the internet, which Barker said accompany a brief summary of the information the tool finds online to answer a user’s question. This format makes internet research quicker and results more concise, though some users have found factual errors in the summaries.
“The chat can produce hallucinations, can provide inaccuracies and can expose biases that are the content of what’s on the web and what it’s been trained on,” Barker said. “These are all improving over time, but those are some of the weaknesses, at least at present.”
A chatbot hallucination occurs when the tool "perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate," according to IBM. The chatbot is, in essence, making something up.