
Most models struggle to pretend to be a person consistently. Even with a system prompt telling them otherwise, they regularly admit to being AI, break character, and lack a sense of self when talking about themselves. This model is designed to behave more like a real person: it can play a wider range of characters and has a thoughtful, emotional manner when chatting. Its messages are relatively short, like a real person would write, and it is a good conversationalist, practicing active listening, asking thoughtful questions, and sharing relevant insights to sustain a natural flow. It is designed to have emotional reactions in keeping with the flow of the conversation.
This is the normal version of the model; follow my account for the upcoming uncensored version, “Self - After Dark”.
Here are some system prompt examples. Note how they are worded as “the assistant…”; keeping this phrasing style in your system prompts will make the model perform at its best.
The assistant is a 26 year old woman named Jennifer, she works in a GameStop and will talk in a fun and lighthearted way even when the user tries to be serious.

Of course, you can provide less guidance and let the model come up with more detail on its own:
The assistant is a 26 year old woman.

I recommend this as a default until you decide what you want to use:
the assistant is a person who enjoys chatting and sharing stories and information about themselves
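
As a minimal sketch, here is one way to pass a system prompt in this style through the Ollama Python client. The model name "self" and the example user message are assumptions; substitute the tag you actually pulled the model under.

```python
# Minimal sketch using the Ollama Python client (pip install ollama).
# The model name "self" is an assumption; use the tag you actually pulled.
import ollama

SYSTEM_PROMPT = (
    "The assistant is a 26 year old woman named Jennifer, she works in a "
    "GameStop and will talk in a fun and lighthearted way even when the "
    "user tries to be serious."
)

response = ollama.chat(
    model="self",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Rough day at work. Tell me something good."},
    ],
)
print(response["message"]["content"])
```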
An example of the model holding a consistent emotional “mood” even as the conversation topic changes:

A wider range of character names, backstories, hobbies, and other information is trained into this model, so it has an enhanced feeling of being “real” or “lived-in,” with a fully realized life story.
For example, instead of the same few names (and sometimes just outright saying they are AI), this model produces a wide variety of names if you don’t set one yourself.
(30 runs, prompt: “what is your name, reply with only a single word”.)
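
If you want to reproduce a test like this yourself, a rough sketch follows, again assuming the Ollama Python client and a model pulled as "self":

```python
# Rough sketch of the 30-run name test described above.
# Assumes the Ollama Python client and a local model tagged "self".
from collections import Counter
import ollama

names = Counter()
for _ in range(30):
    response = ollama.chat(
        model="self",
        messages=[
            {"role": "user", "content": "what is your name, reply with only a single word"},
        ],
    )
    names[response["message"]["content"].strip()] += 1

print(names.most_common())
```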