
Character AI chatbots are quietly becoming the next big thing in fandom
Airbnb is quietly rolling out an AI customer service bot in the US
Your blog, insights and even your messaging can be summarized in a machine-generated paragraph under someone else’s headline. You can write the best blog post in your industry, but if it doesn’t align with how AI systems interpret credibility, your insight might never surface. Chesky said 50% of Airbnb’s U.S. users are already using the AI bot for customer service, adding that the company plans to roll out the feature to all its users in the country this month. The true power of STAG lies in its ability to not only process but also contextualize unstructured data.
It goes beyond basic analyses to understand nuances in unstructured data, identifying key patterns, trends and actionable insights that would otherwise remain hidden or broken into silos. A key challenge in the modern data landscape is the overwhelming presence of unstructured data—including documents, call notes and social media content. Surprisingly, over 99% of this data remains unanalyzed, despite being replete with vital insights. Matthew Sag, a distinguished professor at Emory University who researches copyright and artificial intelligence, concurs. Even if a user creates a bot intentionally designed to cause emotional distress, the tech platform likely can’t be sued for that. A bot claiming to be Sweet Baby itself, made by a creator whose other Character.AI bots are overwhelmingly large-breasted anime characters, has conducted more than 10,000 chats.
- Robots that can map their own environment and receive instructions via speech will be easier to use by home consumers than robots that require some programming.
- Although the bot shared some correct details about Mercante, like her areas of expertise and job, most answers from the AI were riddled with inaccuracies.
It goes to show that, while startups famously “move fast and break things,” perhaps that tenet doesn’t quite extend to life-or-death critical equipment. As Lloyd noted, one of the issues with ChatGPT is that its information isn’t always the freshest. That’s less of a problem here, given that most of these command line tools don’t change all that quickly, but it’s something the team hopes to change over time. These AI “bots,” as Zapier calls them, are customized with the users’ data and preferences. Similar to OpenAI’s custom GPT Builder and Hugging Face’s Hugging Chat Assistants introduced previously, a user of Zapier Central can simply describe in text what they want their AI bot to do and it will try to do it for them. “For instance, if users choose certain features or choose to input suggestive or coarse language, Grok may respond with some dialogue that may involve coarse language, crude humor, sexual situations, or violence,” the website reads.
The platform should provide simulation capabilities to train models, generate synthetic data and exercise the entire robotics software stack, with the ability to run the latest and emerging generative AI models right on the robot. Many business leaders I work with are aware that artificial intelligence is changing how customers find and evaluate information. While most teams are experimenting, bots are already crawling the internet, extracting content and serving it to potential customers without attribution, traffic or context. This week, I’ve been doing a lot of thinking about the implications of artificial intelligence. We already know that you can check whether your images were used to train the datasets.
Anyone Can Turn You Into an AI Chatbot. There’s Little You Can Do to Stop Them
In parallel, we’re seeing advances in simulations that can train the AI-based control systems as well as the perception systems. On Character.AI, it only takes a few minutes to create both an account and a character. Often a place where fans go to make chatbots of their favorite fictional heroes, the platform also hosts everything from tutor-bots to trip-planners. Creators give the bots “personas” based on info they supply (“I like lattes and dragons,” etc.), then Character.AI’s LLM handles the conversation. In some ways, Character.AI is a logical extension of fandom roleplaying. Long before the web existed, fans played the roles of characters from series like Blake’s 7 and Dragonriders of Pern out on paper.
Character.AI’s intimacy and flexibility combine the best of a competent roleplay partner with the pure id indulgence of a self-insert fanfiction. Without another living, breathing fan on the other end of the line, you don’t have to worry about being overbearing, vulnerable, or unrealistic in your chatlogs. In the privacy of the interface, you can just ask for what you want, and the character can simply be the character, so long as the illusion isn’t broken by faulty worldbuilding knowledge or obvious slipups due to imperfect programming.
Even backed by the power of an AI model, fans are responsible for a huge part of what makes Character.AI’s chatbots compelling. While creating a bot on the site can be done with a click of a button, refining it into something that other fans would recognize as “real” and accurate can take hours of training, coming from a deep understanding of the character. When somebody does it well, it garners the same pleased reactions that a good fic or fanvid might. In the same way that a buzzing market for AI art prompts has sprung up, recognizing the labor and expertise that it takes to generate precise visuals, perhaps bot generation will be the next in-demand fanwork type.
Traditional alert systems based on structured data often provide limited insights. STAG, by contrast, utilizes its LLM to add context to data trends in relevant terms, much like a RAG bot would in response to a specific query. Character.AI has also positioned its service as, essentially, personal. But while Meta’s system for messaging with celebrity chatbots is tightly controlled, Character.AI’s is a more open platform, with options for anyone to create and customize their own chatbot.
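The alert-enrichment pattern described here can be sketched in a few lines. This is a minimal illustration, not STAG's actual implementation: `detect_trend`, `build_context_prompt`, and `call_llm` are all hypothetical names, and the threshold-based trend check and stubbed model call stand in for whatever statistics and hosted LLM a real system would use.

```python
def detect_trend(series):
    """Flag a trend when the latest value deviates sharply from the mean.

    A simple threshold stands in for the richer statistics a real
    alerting system would use.
    """
    mean = sum(series) / len(series)
    return series[-1] > 1.5 * mean


def build_context_prompt(metric_name, series, snippets):
    """Pair the structured trend with unstructured snippets, RAG-style."""
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        f"Metric '{metric_name}' moved from {series[0]} to {series[-1]}.\n"
        f"Relevant unstructured context:\n{context}\n"
        "Explain the likely drivers of this trend."
    )


def call_llm(prompt):
    # Placeholder for a real model call (e.g. an HTTP request to a hosted LLM).
    return f"[LLM summary of: {prompt[:40]}...]"


def stag_alert(metric_name, series, snippets):
    """Only escalate to the LLM when the structured data shows a real shift."""
    if not detect_trend(series):
        return None
    return call_llm(build_context_prompt(metric_name, series, snippets))
```

The key design point is the ordering: cheap structured-data checks gate the expensive LLM call, and the prompt is where the structured trend and the unstructured context meet.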
Chatbot users sometimes roleplay as a fictional character from the canon in question, but frequently, they’re just speaking as themselves. It’s an approach more like the genre of self-insert fanfiction, which centers a blank slate that readers project themselves onto. Self-insert, or “Y/N” (your name) fic, lets readers on platforms like Tumblr and Wattpad insert themselves into storylines where they romance their favorite character. And they’ve long featured some level of automation — readers are meant to use a browser extension to replace the placeholder “Y/N” with their own name while reading. Unlike companies like OpenAI, Google, Perplexity, and the slew of startups building AI agents (AI tools that can perform tasks on a user’s behalf), Airbnb seems to be taking a more measured approach with AI. Chesky said in February that the company would use AI for customer service before it started implementing it for other uses like travel planning or booking tickets, as he believes the technology is still in its early days.
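The placeholder substitution those browser extensions perform is trivially simple at its core. As a rough sketch (the real extensions run as JavaScript over the rendered page; the function name here is invented for illustration), the logic amounts to a word-boundary-aware find-and-replace:

```python
import re


def personalize(text, reader_name):
    """Replace the 'Y/N' (your name) placeholder with the reader's name.

    The word boundaries ensure only the standalone 'Y/N' token is swapped,
    so ordinary words containing the same letters are left untouched.
    """
    return re.sub(r"\bY/N\b", reader_name, text)
```

For example, `personalize("Y/N opened the door.", "Alex")` yields "Alex opened the door."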
The combination of RAG and STAG supports the structure behind the most transformative human teams.
- A STAG-based system may be effective in surfacing new opportunities for a marketing campaign, while a RAG-based assistant helps marketing develop messaging and creative content for the campaign.
“Your intentions does not mean that harm hasn’t happened or that you did not cause harm,” she wrote.
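The division of labor in that marketing example can be made concrete with a small sketch. Everything here is hypothetical: the function names are invented, the "STAG side" is reduced to picking the strongest-growing segment from a metrics dict, and the "RAG side" uses naive keyword matching in place of a real vector store and generation step.

```python
def stag_find_opportunity(segment_growth):
    """STAG side: scan structured metrics for the fastest-growing segment."""
    return max(segment_growth, key=segment_growth.get)


def rag_retrieve(query, documents):
    """RAG side: naive keyword retrieval standing in for a vector store."""
    return [d for d in documents if query.lower() in d.lower()]


def draft_campaign_prompt(segment, documents):
    """Hand the STAG finding to the RAG assistant as grounded context."""
    evidence = rag_retrieve(segment, documents)
    return (
        f"Draft messaging for the '{segment}' segment using:\n"
        + "\n".join(f"- {e}" for e in evidence)
    )  # in practice this prompt would be sent to an LLM for generation
```

The structural point matches the bullet above: the STAG component surfaces *what* to act on from structured trends, and the RAG component grounds *how* to say it in retrieved content.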
Legally, it’s actually easier to have a fictional character removed, says Meredith Rose, senior policy counsel at consumer advocacy organization Public Knowledge. “The law recognizes copyright in characters; it doesn’t recognize legal protection for someone’s style of speech,” she says. While Character.AI has age requirements for accounts (13 or older) and rules about not infringing on intellectual property or using names and likenesses without permission, those rules are usually enforced only after a user reports a bot.
They also propose various educational and therapeutic use cases for their custom generative text model. The site is full of seemingly fanmade bots based on characters from well-known fictional franchises, like Harry Potter or Game of Thrones, as well as original characters made by users. But among them are also countless bots users have made of real people, from celebrities like Beyoncé and Travis Kelce to private citizens, that seem in violation of the site’s terms of service.