
Why ChatGPT is not the answer to improving your company operations

In the last week alone, I’ve had 5 conversations with our partners and prospects who mentioned ChatGPT. It’s a pattern now.

One Innovation Director even told me that his CEO started asking how their company could utilise ChatGPT internally. Suddenly, the business world has woken up to a technology that, at its current level of effectiveness, has been around for a good couple of years now. Everybody wants to reap the benefits that this AI tool holds for business.

ChatGPT has the potential to automate many processes and make human-machine interactions smoother. However, as with any new technology, you need to carefully analyse the benefits it can bring to your business, with a big focus on the risks it presents. Just because ChatGPT exists and offers a decent user experience doesn’t mean you can (and should) use it for all aspects of your business.

What could go wrong?

ChatGPT is a great starting point (e.g. for helping with research, gathering ideas or supporting content creation) but it’s not a reliable tool when it comes to your internal company data.

For starters, the ChatGPT models are trained on data that only goes up to 2021. ChatGPT doesn’t have access to newer information or real-time data, which can be an issue if you want to provide up-to-date, accurate information.

Just like Alexa or Siri, ChatGPT won’t understand the nuances of your company-specific language; it will be equally helpless.

Then, without special training, ChatGPT has no built-in “I don’t know” response, meaning many of its answers sound authoritative but are in fact ‘hallucinations’. It will answer any question very confidently, regardless of whether the answer is true or false.

Roughly speaking, the hallucination rate for ChatGPT is 15% to 20%, so around 80% to 85% of the time it does well, and the rest of the time it makes things up. The key is to detect when it is hallucinating and make sure you deliver an alternative answer or fallback response to the user instead of the hallucination.
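As a rough illustration of that guardrail idea, here is a minimal sketch in Python. The grounding_score helper, the 0.6 threshold and the fallback message are hypothetical placeholders, not part of ChatGPT or any specific product; in practice you would score answers against your own internal sources (for example with embedding similarity).

```python
# Minimal sketch of a hallucination guardrail. The scoring logic and the
# threshold below are illustrative assumptions, not a real API.

FALLBACK_ANSWER = (
    "I'm not confident enough to answer this. "
    "Let me route your question to a colleague."
)

def grounding_score(answer: str, supporting_docs: list[str]) -> float:
    """Hypothetical scorer: the fraction of answer sentences that can be
    found in at least one internal document (a crude stand-in for a
    similarity-based check)."""
    if not supporting_docs:
        return 0.0
    sentences = [s.strip() for s in answer.split(".") if s.strip()]
    supported = sum(
        any(sentence.lower() in doc.lower() for doc in supporting_docs)
        for sentence in sentences
    )
    return supported / max(len(sentences), 1)

def answer_user(model_answer: str, supporting_docs: list[str]) -> str:
    # Only pass the model's answer through if it is sufficiently grounded
    # in internal sources; otherwise deliver the safe fallback instead.
    if grounding_score(model_answer, supporting_docs) >= 0.6:
        return model_answer
    return FALLBACK_ANSWER
```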

The Internet is full of examples of ChatGPT going off the rails. Properly prompted, the model will give you exquisitely written (and wrong) text about the record for walking across the English Channel on foot, explain why 2+2 equals 5, or write a compelling essay about why mayonnaise is a racist condiment.

So what’s a better alternative? A system whose models are trained on your internal data. That’s what makes systems like Untrite much more effective for internal operations like service intelligence, quoting by similarity or understanding market trends.

While OpenAI’s ChatGPT generates content from whatever it deems suitable, without providing its sources, the Untrite Intelligence™ platform provides deep links to the source internal information, so you know who wrote it and what their expertise is.
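Conceptually, that source-attribution approach looks something like the sketch below. This is not Untrite’s actual implementation (which isn’t public); the InternalDoc fields and the answer format are assumptions made purely to illustrate answers that always link back to an author and their expertise.

```python
from dataclasses import dataclass

@dataclass
class InternalDoc:
    title: str
    author: str
    expertise: str
    url: str       # deep link back into your internal knowledge base
    content: str

def answer_with_sources(question: str, retrieved: list[InternalDoc]) -> str:
    """Assemble an answer that cites the internal documents it draws on,
    so every claim can be traced to an author and their area of expertise."""
    if not retrieved:
        return "No internal source covers this question yet."
    lines = [f"Based on {len(retrieved)} internal document(s):"]
    for doc in retrieved:
        lines.append(f"- {doc.title}, by {doc.author} ({doc.expertise}): {doc.url}")
    return "\n".join(lines)
```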

It is delightful to see that many leaders want to understand this technology better and to implement AI- and NLP-based systems internally. We’ll be running a webinar on this subject in the next couple of weeks, so make sure to subscribe to Untrite’s LinkedIn page to be informed of the details.
