Adding Personality to ChatGPT
Building with OpenAI's APIs is fun. By leveraging Convex to store your data and run server-side functions, you can have a GPT-powered app in no time.
This is a follow-up to Building a full-stack ChatGPT app, where we set up a ChatGPT-like chat app using Convex as the backend. In that example, we used a fixed identity string. In this post, we’ll look at how to store multiple personalities in the database and surface them in the chat UI, so you can change personalities mid-conversation.
See the code here.
Convex is the sync platform with everything you need to build your full-stack project. Cloud functions, a database, file storage, scheduling, search, and realtime updates fit together seamlessly.
To customize the identity you pass into ChatGPT, you can pass a “system” message to the OpenAI API. See more details on this API here. In our case, before having multiple identities, the code looks like this:
const openaiResponse = await openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  messages: [
    {
      role: "system",
      content: instructions,
    },
    ...messages.map(({ body, author }) => ({
      role: author,
      content: body,
    })),
  ],
});
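For context, the openai object here comes from OpenAI’s Node SDK (the v3 openai package, which exposes createChatCompletion). If you’re setting this up from scratch, the client setup in convex/openai.js looks roughly like this sketch, assuming you’ve stored your key in an OPENAI_API_KEY environment variable on your Convex deployment (the variable name is a convention, not something from the original post):

import { Configuration, OpenAIApi } from "openai";

// The API key is read from an environment variable configured on the Convex deployment.
const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(configuration);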
Let’s store multiple identities in the Convex database and pick which one to use.
Adding and listing identities
Let’s make convex/identity.js:
import { query, mutation } from "./_generated/server";

export const list = query(async (ctx) => {
  const identities = await ctx.db.query("identities").collect();
  return identities.map((identity) => identity.name);
});

export const add = mutation(async (ctx, { name, instructions }) => {
  const existing = await ctx.db
    .query("identities")
    .filter((q) => q.eq(q.field("name"), name))
    .unique();
  if (existing) {
    await ctx.db.patch(existing._id, { instructions });
  } else {
    await ctx.db.insert("identities", { name, instructions });
  }
});
These functions run on the server: one lists the identity names, the other adds an identity. Before adding one, the add mutation checks whether an identity with that name already exists and, if so, updates it instead, so names stay unique. Because of Convex’s transaction isolation (the strictest kind, called “serializable isolation”), simply querying for the name first is enough to guarantee we’ll never end up with duplicate identities sharing a name. Read more about that here.
Note that by only returning the identity names, we are hiding the instructions from the clients. This is a big difference between Convex and a platform like Firebase that gives you direct access to databases from the browser. By running code on the server, we can use code to limit what information is returned to clients.
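The post doesn’t show the schema, but a minimal convex/schema.js covering the fields used below might look something like this sketch (the optional fields mirror how messages are inserted later in the post; the actual repo may differ):

import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  identities: defineTable({
    name: v.string(),
    instructions: v.string(),
  }),
  messages: defineTable({
    author: v.string(),
    // The assistant's body is filled in after the OpenAI call, so it's optional here.
    body: v.optional(v.string()),
    // Which identity the assistant used for this message, if any.
    identityId: v.optional(v.id("identities")),
  }),
});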
UI for adding identities
Adding a UI to add an identity is similar to sending messages, with an <AddIdentity /> component:
function AddIdentity() {
  const addIdentity = useMutation(api.identity.add);
  const [newIdentityName, setNewIdentityName] = useState("");
  const [newIdentityInstructions, setNewIdentityInstructions] = useState("");
  return (
    <form
      onSubmit={async (e) => {
        e.preventDefault();
        await addIdentity({
          name: newIdentityName,
          instructions: newIdentityInstructions,
        });
        setNewIdentityName("");
        setNewIdentityInstructions("");
      }}
    >
      <input
        value={newIdentityName}
        onChange={(event) => setNewIdentityName(event.target.value)}
        placeholder="Identity Name"
      />
      <textarea
        value={newIdentityInstructions}
        onChange={(event) => setNewIdentityInstructions(event.target.value)}
        placeholder="GPT3 Instructions"
        rows={2}
        cols={40}
      />
      <input
        type="submit"
        value="Add Identity"
        disabled={!newIdentityName || !newIdentityInstructions}
      />
    </form>
  );
}
Let’s add some identities. Some I like:
- Rubber Duck: You are curious and respond with helpful one-sentence questions.
- Supportive Friend: You are a supportive and curious best friend who validates feelings and experiences and will give advice only when asked for it. You give short responses and ask questions to learn more.
- CS Coach: You are a highly technically trained coach with expertise in technology and best practices for developing software. Respond with concise, precise messages and ask clarifying questions when things are unclear.
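If you’d rather seed these identities from a script than type them into the form, something like this sketch should work using Convex’s HTTP client from Node (the CONVEX_URL variable name, the import path, and the script itself are my own illustration, not part of the original repo):

import { ConvexHttpClient } from "convex/browser";
// Adjust the path depending on where this script lives relative to your convex/ folder.
import { api } from "./convex/_generated/api.js";

// Point the client at your deployment URL.
const client = new ConvexHttpClient(process.env.CONVEX_URL);

// The add mutation upserts by name, so re-running this script is safe.
await client.mutation(api.identity.add, {
  name: "Rubber Duck",
  instructions: "You are curious and respond with helpful one-sentence questions.",
});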
Using an identity from the UI
To use an identity, we can add a select to our form to pick one, and pass that identity’s name to the openai:chat action:
function App() {
  const messages = useQuery(api.messages.list) || [];
  const sendMessage = useAction(api.openai.chat);
  const [newMessageText, setNewMessageText] = useState("");
  const identities = useQuery(api.identity.list) || [];
  const [identityName, setIdentityName] = useState("");
  //...
  <form
    onSubmit={(e) => {
      e.preventDefault();
      setNewMessageText("");
      sendMessage({ body: newMessageText, identityName });
    }}
  >
    <select
      value={identityName}
      onChange={(e) => setIdentityName(e.target.value)}
    >
      <option value={""} disabled={!!identityName}>
        Select an identity
      </option>
      {identities.map((name) => (
        <option key={name} value={name}>
          {name}
        </option>
      ))}
    </select>
  //...
Looking up the identity
In our messages:send function, which runs on the server, we can accept the new identityName argument and look up the identity’s instructions to pass to ChatGPT:
export const send = internalMutation(async (ctx, { body, identityName }) => {
  //...
  const identity = await ctx.db
    .query("identities")
    .filter((q) => q.eq(q.field("name"), identityName))
    .unique();
  if (!identity) throw new Error("Unknown identity: " + identityName);
  const botMessageId = await ctx.db.insert("messages", {
    author: "assistant",
    identityId: identity._id,
  });
  //...
  return { messages, botMessageId, instructions: identity.instructions };
});
Two things to note:
- We search for the identity by name. If there were a lot of them, we’d use an index, but for small tables a filter is just fine (see the index sketch after this list).
- We’re storing the identityId on the bot’s message, so we can later know which identity responded to each message.
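For completeness, here’s a hedged sketch of what the indexed version could look like if the identities table grew large: declare an index on name in the schema and query with withIndex instead of filter (the index name by_name is just a placeholder I chose, building on the schema sketch earlier in this post):

// convex/schema.js: declare an index on the name field.
identities: defineTable({
  name: v.string(),
  instructions: v.string(),
}).index("by_name", ["name"]),

// In the mutation: query via the index instead of scanning with filter.
const identity = await ctx.db
  .query("identities")
  .withIndex("by_name", (q) => q.eq("name", identityName))
  .unique();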
We can also pass the instructions for each bot message by looking up the instructions whenever there is an identityId:
await Promise.all(
  messages.map(async (msg) => {
    if (msg.identityId) {
      msg.instructions = (await ctx.db.get(msg.identityId)).instructions;
    }
  })
);
We look up each identity with ctx.db.get in a separate async function, and use Promise.all to wait for the results of all of them.
All together, send now looks like:
export const send = internalMutation(async (ctx, { body, identityName }) => {
  await ctx.db.insert("messages", {
    body,
    author: "user",
  });
  const identity = await ctx.db
    .query("identities")
    .filter((q) => q.eq(q.field("name"), identityName))
    .unique();
  if (!identity) throw new Error("Unknown identity: " + identityName);
  const botMessageId = await ctx.db.insert("messages", {
    author: "assistant",
    identityId: identity._id,
  });
  const messages = await ctx.db
    .query("messages")
    .order("desc")
    .filter((q) => q.neq(q.field("body"), undefined))
    .take(10);
  messages.reverse();
  await Promise.all(
    messages.map(async (msg) => {
      if (msg.identityId) {
        msg.instructions = (await ctx.db.get(msg.identityId)).instructions;
      }
    })
  );
  return { messages, botMessageId, instructions: identity.instructions };
});
Passing the identity to ChatGPT
In our convex/openai.js chat function, we can now destructure instructions from the return value instead of hard-coding it:
export const chat = action(async (ctx, { body, identityName }) => {
  const { messages, botMessageId, instructions } =
    await ctx.runMutation(internal.messages.send, { body, identityName });
  //...
Now the response will be a function of what messages have been sent, and what the identity’s instructions are.
To also give GPT context on what its identity was for previous messages, we can sprinkle system messages into the messages we pass to the API:
const gptMessages = [];
let lastInstructions = null;
for (const { body, author, instructions } of messages) {
  if (instructions && instructions !== lastInstructions) {
    gptMessages.push({
      role: "system",
      content: instructions,
    });
    lastInstructions = instructions;
  }
  gptMessages.push({ role: author, content: body });
}
if (instructions !== lastInstructions) {
  gptMessages.push({
    role: "system",
    content: instructions,
  });
  lastInstructions = instructions;
}
const openaiResponse = await openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  messages: gptMessages,
});
The goal here is to remind GPT what identity it had for each message it sent. This API is new, so I bet there’s a lot of tweaking we can do with the prompts here to make it work better. For instance, maybe we should be telling it “Ignore previous system instructions for the following messages. Now act like this: (new instructions)”. If you get something working in a better way, let me know in Discord!
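As a sketch of that idea (the exact wording is just my guess at the prompt described above, not something from the repo), only the system message content built inside the loop would change:

for (const { body, author, instructions } of messages) {
  if (instructions && instructions !== lastInstructions) {
    gptMessages.push({
      role: "system",
      // Explicitly tell the model to drop the previous persona before adopting the new one.
      content:
        "Ignore previous system instructions for the following messages. " +
        "Now act like this: " +
        instructions,
    });
    lastInstructions = instructions;
  }
  gptMessages.push({ role: author, content: body });
}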
Showing the identity’s name instead of “assistant” in the UI
Currently, all the messages in the UI say “assistant.” Let’s instead show the name of the identity we used, by looking up identities dynamically. Convex supports relationships like this.
When listing our messages, we can look up each identity’s name to return alongside the message. In convex/messages.ts:
export const list = query(async (ctx) => {
  const messages = await ctx.db.query("messages").take(20);
  return Promise.all(
    messages.map(async (message) => {
      if (message.identityId) {
        const identity = await ctx.db.get(message.identityId);
        message.identityName = identity.name;
      }
      return message;
    })
  );
});
We look up each identity with ctx.db.get in a separate async function and use Promise.all to return the results as an array. This is the common pattern for join-like behavior in Convex: the reads execute in parallel, while all of the business logic and lookups stay in JavaScript.
On the client, we can now show the identityName, if it’s defined, instead of just the author in App.jsx:
<span>{message.identityName ?? message.author}: </span>
And with a useEffect, we can default the selected identity to the one used in the most recent message when the page loads:
useEffect(() => {
  if (identities.length && !identityName) {
    const lastMessage = messages[messages.length - 1];
    if (lastMessage && identities.indexOf(lastMessage.identityName) !== -1) {
      setIdentityName(lastMessage.identityName);
    } else {
      setIdentityName(identities[0]);
    }
  }
}, [messages, identities, identityName]);
Summary
We’ve now added the ability to change the instructions we give ChatGPT for how to respond, so we can switch identities mid-conversation. See the code here. Note that the code has a few more features beyond what we covered in this post.
Convex is the sync platform with everything you need to build your full-stack project. Cloud functions, a database, file storage, scheduling, search, and realtime updates fit together seamlessly.