Continue polling VariantStats while LLM retrieval in progress, minor UI fixes (#54)
* Prevent zoom in on iOS
* Expand function return code background to fill cell
* Keep OutputStats on far right of cells
* Continue polling prompt stats while cells are retrieving from LLM
* Add comment to _document.tsx
* Fix prettier
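The "continue polling while cells are retrieving from LLM" change can be illustrated with a minimal sketch. The `Cell` type, `retrievingFromLLM` field, and `statsRefetchInterval` helper below are hypothetical names, not from this commit; the idea is simply to keep a stats query on a refetch interval as long as any cell is still awaiting an LLM response, and stop polling once all are done (returning `false` disables the interval, in the style of react-query's `refetchInterval` option).

```typescript
// Hypothetical minimal model of a cell's retrieval state.
type Cell = { retrievingFromLLM: boolean };

// Returns a polling interval in ms while any cell is still retrieving
// from the LLM, or false to stop polling once all cells have settled.
function statsRefetchInterval(cells: Cell[]): number | false {
  const anyRetrieving = cells.some((c) => c.retrievingFromLLM);
  return anyRetrieving ? 5000 : false;
}
```

The interval value (5000 ms here) is an arbitrary placeholder; the point is that polling is driven by the cells' in-flight state rather than stopping after the first fetch.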
@@ -31,7 +31,7 @@ export const OutputStats = ({
   const cost = promptCost + completionCost;

   return (
-    <HStack align="center" color="gray.500" fontSize="2xs" mt={{ base: 0, md: 1 }}>
+    <HStack w="full" align="center" color="gray.500" fontSize="2xs" mt={{ base: 0, md: 1 }}>
+      <HStack flex={1}>
       {modelOutput.outputEvaluation.map((evaluation) => {
         const passed = evaluation.result > 0.5;