AI characters are one day going to be the norm across a variety of settings. It's already happening in areas like customer service, where bots can perform simple tasks based on user input.

While the website for this project is still in production and there is no whitepaper that explains the full extent of the technology, it is clear from the site that AI Charm plans to use ChatGPT plug-ins to give AI characters personality, including emotional intelligence, so they can be used for purposes such as interactive in-game advertising and customer consultations in stores. That all sounds fine, but my concern is that they also want to branch into areas such as psychological support and personal tuition. Generative AI is still in its infancy and is currently heavily biased, so relying on it to provide mental health support is not yet (and, in my opinion, never will be) a good idea. It would only take one instance of wrong advice to cause irreversible damage to an individual. I have similar concerns about AI personal tutors, which could provide incorrect or biased information that has a detrimental impact on a person's academic progress.

2024 Cyrator - Crypto Research Community

Disclaimer: The content presented on this website, including any analyses, reviews, and ratings, is provided for informational purposes only and should not be considered financial advice. Cyrator does not endorse or recommend any financial transactions or investments based on the information available on this platform. Visitors to this site should perform their own due diligence and consult with a professional financial advisor before making any investment decisions. Cyrator is not liable for any actions taken, financial or otherwise, based on information or links from this website.