
fix: chat streaming and loading states (WIP) #247

Closed · wants to merge 49 commits

Commits
41dea5a · Update body.ts (hannah-bethclark, Sep 2, 2024)
c1e52e2 · Merge main (stefl, Sep 23, 2024)
9c027ed · fix: chat streaming and loading states (stefl, Oct 18, 2024)
72aae68 · Remove number of patches from extractPatches (stefl, Oct 18, 2024)
9f8a7b2 · Remove console log (stefl, Oct 18, 2024)
924f8e1 · Merge branch 'main' into fix/stream_status (stefl, Oct 18, 2024)
f377197 · Use a new approach to manage the chat state (stefl, Oct 18, 2024)
50fbead · Merge remote-tracking branch 'origin/main' into fix/stream_status (stefl, Oct 18, 2024)
ece8d33 · Use a new approach for adding the stream status to the chat (stefl, Oct 18, 2024)
b4e864c · Behaviour improvements for streaming and scrolling (stefl, Oct 19, 2024)
bf7d7f1 · Linting (stefl, Oct 19, 2024)
e4eeb53 · Linting and small fixes (stefl, Oct 19, 2024)
8ecd5ba · Adjust the prompt structure and Continue action so that we generate f… (stefl, Oct 20, 2024)
bf34d8c · Ensure text is brief (stefl, Oct 20, 2024)
a762ae3 · Merge remote-tracking branch 'origin/hannah-bethclark-patch-1' into f… (stefl, Oct 20, 2024)
66bb244 · Romans update (stefl, Oct 20, 2024)
20ca3c0 · Stop it picking a basedOn (stefl, Oct 20, 2024)
ac3df73 · Try to make it not pick a basedOn (stefl, Oct 20, 2024)
b093f55 · Continue trying to stop it doing basedOn and respond correctly (stefl, Oct 20, 2024)
a3aa37d · basedOn and learning outcomes in one (stefl, Oct 20, 2024)
0c198da · Try even harder for basedOn (stefl, Oct 20, 2024)
9c2bf1f · Make it clearer how to check for basedOn (stefl, Oct 20, 2024)
d8c7a7b · More basedOn (stefl, Oct 20, 2024)
75916b4 · Add placeholders (stefl, Oct 20, 2024)
2a8f82d · Try putting the user prompting rules at the end (stefl, Oct 20, 2024)
109956c · Lint / test fixes (stefl, Oct 21, 2024)
5a4a03d · Fix patch enqueuer test (stefl, Oct 21, 2024)
b501fe9 · Regenerate CSP snapshot tests (stefl, Oct 21, 2024)
668246b · Fix Aila bug with categorisation and successful romans mobile test (stefl, Oct 21, 2024)
7efc429 · CSP snapshot (stefl, Oct 21, 2024)
3653437 · Mock the categoriser in test (stefl, Oct 21, 2024)
76ed651 · Catch and rethrow if there is an openai stream error (stefl, Oct 21, 2024)
0753fde · More definition in the LLM message schema. Ask for intent on which se… (stefl, Oct 21, 2024)
4f7438d · Ask the LLM to plan what it will edit and return what it edited (stefl, Oct 21, 2024)
1d939e2 · cycles is not a valid section ref (stefl, Oct 24, 2024)
b65c48e · Merge branch 'main' into fix/stream_status (stefl, Oct 24, 2024)
7bd8d2e · Add .gitattributes file to hide e2e fixtures in github diffs (stefl, Oct 24, 2024)
e754518 · Hide generated JSON fixtures (stefl, Oct 24, 2024)
2fd8a10 · Hide more recordings (stefl, Oct 24, 2024)
8cf62ed · Merge branch 'main' into fix/stream_status (stefl, Oct 24, 2024)
28e2306 · Merge branch 'main' into fix/stream_status (stefl, Oct 24, 2024)
db05b35 · Merge remote-tracking branch 'origin/main' into fix/stream_status (stefl, Oct 29, 2024)
a9c294c · Merge branch 'main' into fix/stream_status (stefl, Oct 30, 2024)
87b2de6 · Merge branch 'main' into fix/stream_status (stefl, Oct 30, 2024)
244d4db · Fix merge issue, add extra logging (stefl, Oct 30, 2024)
62a0ed8 · Add extra logging (stefl, Oct 30, 2024)
1cb69f4 · Merge branch 'fix/stream_status' of https://github.com/oaknational/oa… (stefl, Oct 30, 2024)
b591254 · More logging (stefl, Oct 30, 2024)
b55bd3d · Fix lesson plan keys detection (stefl, Oct 30, 2024)
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -66,6 +66,7 @@
     "hget",
     "hgetall",
     "hmset",
+    "hotfixes",
     "hset",
     "Hubspot",
     "initialisation",
@@ -131,6 +132,7 @@
     "rushstack",
     "Sedar",
     "slidedeck",
+    "sslcert",
     "sslmode",
     "SUBJ",
     "superjson",
@@ -147,6 +149,7 @@
     "trivago",
     "trpc",
     "Turbopack",
+    "TURBOPACK",
     "turborepo",
     "uidotdev",
     "unjudged",
4 changes: 2 additions & 2 deletions apps/nextjs/src/app/aila/page-contents.tsx
@@ -11,8 +11,8 @@ const ChatPageContents = ({ id }: { id: string }) => {
   return (
     <Layout>
       <LessonPlanTrackingProvider chatId={id}>
-        <ChatProvider key={`chat-${id}`} id={id}>
-          <Chat />
+        <ChatProvider key={`chat-provider-${id}`} id={id}>
+          <Chat key={`chat-${id}`} />
         </ChatProvider>
       </LessonPlanTrackingProvider>
     </Layout>
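The key change matters because React tears down and recreates a subtree whenever its `key` changes, which discards any stale state carried over from a previous chat. A minimal sketch of that keyed behaviour, written as a plain TypeScript analogy rather than React's actual implementation (the `Instance` class and `render` helper here are hypothetical):

```typescript
// Hypothetical analogy for keyed reconciliation: a cache of component
// instances indexed by key. Rendering with the same key reuses the
// instance (state survives); rendering with a new key creates a fresh
// instance (state resets), which is the effect the PR relies on when it
// gives <Chat /> its own per-chat key.
class Instance {
  messages: string[] = [];
}

const mounted = new Map<string, Instance>();

function render(key: string): Instance {
  let instance = mounted.get(key);
  if (!instance) {
    instance = new Instance(); // first render for this key: "remount"
    mounted.set(key, instance);
  }
  return instance;
}

const a = render("chat-123");
a.messages.push("hello");
const b = render("chat-123"); // same key: same instance, state kept
const c = render("chat-456"); // new key: new instance, state reset
```

While the key is stable, `b` is the same object as `a`; switching to `chat-456` yields an instance with an empty `messages` array, mirroring how a changed `key` clears component state between chats.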
@@ -1,10 +1,42 @@
 import { useMemo, useEffect } from "react";

+import type { LessonPlanKeys } from "@oakai/aila/src/protocol/schema";
 import { aiLogger } from "@oakai/logger";
 import type { Message } from "ai";

+import { allSectionsInOrder } from "../../../../../lib/lessonPlan/sectionsInOrder";
+
 const log = aiLogger("chat");

+function findStreamingSections(message: Message | undefined): {
+  streamingSections: LessonPlanKeys[];
+  streamingSection: LessonPlanKeys | undefined;
+  content: string | undefined;
+} {
+  if (!message?.content) {
+    return {
+      streamingSections: [],
+      streamingSection: undefined,
+      content: undefined,
+    };
+  }
+  log.info("Parsing message content", message.content);
+  const { content } = message;
+  const regex = /"path":"\/([^/"]*)/g;
+  const pathMatches =
+    content
+      .match(regex)
+      ?.map((match) => match.replace(/"path":"\//, "").replace(/"$/, "")) ?? [];
+
+  const streamingSections: LessonPlanKeys[] = pathMatches.filter(
+    (i): i is string =>
+      typeof i === "string" && allSectionsInOrder.includes(i as LessonPlanKeys),
+  ) as LessonPlanKeys[];
+  const streamingSection: LessonPlanKeys | undefined =
+    streamingSections[streamingSections.length - 1];
+  return { streamingSections, streamingSection, content };
+}
+
 export type AilaStreamingStatus =
   | "Loading"
   | "RequestMade"
@@ -18,36 +50,50 @@ export const useAilaStreamingStatus = ({
 }: {
   isLoading: boolean;
   messages: Message[];
-}): AilaStreamingStatus => {
-  const ailaStreamingStatus = useMemo<AilaStreamingStatus>(() => {
-    const moderationStart = "MODERATION_START";
-    const chatStart = "CHAT_START";
-    if (messages.length === 0) return "Idle";
+}): {
+  status: AilaStreamingStatus;
+  streamingSection: LessonPlanKeys | undefined;
+  streamingSections: LessonPlanKeys[] | undefined;
+} => {
+  const { status, streamingSection, streamingSections } = useMemo(() => {
+    const moderationStart = `MODERATION_START`;
+    const chatStart = `CHAT_START`;
+    if (messages.length === 0)
+      return {
+        status: "Idle" as AilaStreamingStatus,
+        streamingSection: undefined,
+      };
     const lastMessage = messages[messages.length - 1];

+    let status: AilaStreamingStatus = "Idle";
+    log.info("Find streaming sections", lastMessage);
+    const { streamingSections, streamingSection, content } =
+      findStreamingSections(lastMessage);
+
     if (isLoading) {
-      if (!lastMessage) return "Loading";
-      const { content } = lastMessage;
-      if (lastMessage.role === "user") {
-        return "RequestMade";
-      } else if (content.includes(moderationStart)) {
-        return "Moderating";
-      } else if (
-        content.includes('"type":"prompt"') ||
-        content.includes('\\"type\\":\\"prompt\\"')
-      ) {
-        return "StreamingChatResponse";
-      } else if (content.includes(chatStart)) {
-        return "StreamingLessonPlan";
+      if (!lastMessage || !content) {
+        status = "Loading";
+      } else {
+        if (lastMessage.role === "user") {
+          status = "RequestMade";
+        } else if (content.includes(moderationStart)) {
+          status = "Moderating";
+        } else if (content.includes(`"type":"text"`)) {
+          status = "StreamingChatResponse";
+        } else if (content.includes(chatStart)) {
+          status = "StreamingLessonPlan";
+        } else {
+          status = "Loading";
+        }
       }
-      return "Loading";
     }
-    return "Idle";
+
+    return { status, streamingSections, streamingSection };
   }, [isLoading, messages]);

   useEffect(() => {
-    log.info("ailaStreamingStatus set:", ailaStreamingStatus);
-  }, [ailaStreamingStatus]);
+    log.info("ailaStreamingStatus set:", status);
+  }, [status]);

-  return ailaStreamingStatus;
+  return { status, streamingSection, streamingSections };
 };

Review comment on lines +77 to +84 (Collaborator):
> I think this could be extracted to a getStatus function that returns status, rather than needing to set and modify the status variable. Then the streamingSection(s) logic can be independent.

Review comment (Contributor Author):
> Adds the slug of the section that is streaming, as well as all sections that have or are streaming.
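The section-detection logic above can be exercised in isolation. A minimal standalone sketch in TypeScript, with `allSectionsInOrder` trimmed to a hypothetical subset (the real list lives in `lib/lessonPlan/sectionsInOrder`) and the message reduced to its string content:

```typescript
// Trimmed, hypothetical subset of the lesson-plan section keys.
const allSectionsInOrder = [
  "title",
  "learningOutcome",
  "learningCycles",
  "keyLearningPoints",
] as const;

type LessonPlanKey = (typeof allSectionsInOrder)[number];

function findStreamingSections(content: string): {
  streamingSections: LessonPlanKey[];
  streamingSection: LessonPlanKey | undefined;
} {
  // Each JSON patch in the stream carries a path like "path":"/learningOutcome";
  // the regex collects the first path segment of every patch seen so far,
  // even when the final patch is still incomplete JSON.
  const regex = /"path":"\/([^/"]*)/g;
  const pathMatches =
    content.match(regex)?.map((m) => m.replace(/"path":"\//, "")) ?? [];

  const streamingSections = pathMatches.filter((p): p is LessonPlanKey =>
    (allSectionsInOrder as readonly string[]).includes(p),
  );
  // The most recently matched section is the one currently being streamed.
  return {
    streamingSections,
    streamingSection: streamingSections[streamingSections.length - 1],
  };
}

// A partially streamed response: one complete patch, one still arriving.
const partialStream =
  '{"type":"patch","value":{"path":"/learningOutcome","op":"add"}}' +
  '{"type":"patch","value":{"path":"/keyLearningPoints","op"';
const result = findStreamingSections(partialStream);
// result.streamingSections → ["learningOutcome", "keyLearningPoints"]
// result.streamingSection  → "keyLearningPoints"
```

This works on raw text rather than parsed JSON precisely because the stream is usually mid-document when the hook runs; any path segment that is not a known section key is silently dropped by the filter.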
@@ -1,6 +1,9 @@
 import { useMemo } from "react";

-import type { LooseLessonPlan } from "@oakai/aila/src/protocol/schema";
+import type {
+  LessonPlanKeys,
+  LooseLessonPlan,
+} from "@oakai/aila/src/protocol/schema";
 import { lessonPlanSectionsSchema } from "@oakai/exports/src/schema/input.schema";
 import type { ZodIssue } from "zod";

@@ -16,17 +19,20 @@ function getCompleteness(errors: ZodIssue[], fields: string[]) {

   return !hasErrorInSomeField;
 }
-export type ProgressSections = {
-  label: string;
-  key: string;
-  complete: boolean;
-}[];
+export type ProgressSections = ProgressSection[];

 type ProgressForDownloads = {
-  sections: ProgressSections;
+  sections: ProgressSection[];
   totalSections: number;
   totalSectionsComplete: number;
 };

+type ProgressSection = {
+  label: string;
+  key: LessonPlanKeys;
+  complete: boolean;
+};
+
 export function useProgressForDownloads({
   lessonPlan,
   isStreaming,
@@ -60,7 +66,8 @@
       return true;
     }
   }) || [];
-  const sections = [
+
+  const sections: ProgressSection[] = [
     {
       label: "Lesson details",
       key: "title",
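The progress hook derives section completeness from Zod validation issues: a group of fields is complete when no issue path points into it. A minimal sketch of that idea in standalone TypeScript; the issue shape is simplified and the body of `getCompleteness` is reconstructed from the fragment shown in the diff, so treat it as an assumption about the full implementation:

```typescript
// Simplified stand-in for zod's ZodIssue: only the error path matters here.
type IssueLike = { path: (string | number)[] };

// A group of lesson-plan fields counts as complete when no validation
// issue has one of those fields as the first segment of its path.
function getCompleteness(errors: IssueLike[], fields: string[]): boolean {
  const hasErrorInSomeField = errors.some((error) =>
    fields.some((field) => error.path[0] === field),
  );
  return !hasErrorInSomeField;
}

// e.g. validating a partial lesson plan produced issues for two sections:
const issues: IssueLike[] = [
  { path: ["learningOutcome"] },
  { path: ["cycle1", "title"] },
];

const titleDone = getCompleteness(issues, ["title"]); // no issue: complete
const outcomeDone = getCompleteness(issues, ["learningOutcome"]); // incomplete
```

With this shape, the hook only needs to run the schema once and then ask `getCompleteness` per section group to build the `complete` flags in `sections`.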