Full Stack LlamaIndex Example with Next.js

In this article, we will create a simple starter example using LlamaIndex with Next.js.

Next.js doesn't play nicely with LlamaIndex out of the box, so this guide covers the extra configuration you'll need.

We will walk through setting up a basic application that takes a user's question and returns a response.

Setup Next.js Project

First, let's create a new Next.js project. When prompted, choose Tailwind and TypeScript so you can follow along with this guide.

Open your terminal, run the following command, and follow the prompts:

npx create-next-app@latest

Once your project is created, change into the project directory and add the two dependencies we will use (llamaindex for the retrieval logic and sonner for toast notifications):

npm install llamaindex sonner

Configure Next.js

Open next.config.mjs in the root of your project (create it if it doesn't exist) and add the following configuration. LlamaIndex pulls in native Node packages (sharp and onnxruntime-node) that webpack can't bundle, so we alias them to false to exclude them:

/** @type {import('next').NextConfig} */
const nextConfig = {
  // Note: the static export option (output: "export") is omitted here
  // because static export does not support the API route we build below.

  // Override the default webpack configuration
  webpack: (config) => {
    // See https://webpack.js.org/configuration/resolve/#resolvealias
    config.resolve.alias = {
      ...config.resolve.alias,
      sharp$: false,
      "onnxruntime-node$": false,
    };
    return config;
  },
};

export default nextConfig;
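If your project generated a next.config.js (CommonJS) file instead, the only change is the export at the bottom, since export default is ESM-only:

/** @type {import('next').NextConfig} */
const nextConfig = {
  // ...same options as above
};

module.exports = nextConfig;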

Environment Variables

You will also need an OpenAI API key to follow along. Grab one from your OpenAI account dashboard and then add it to a .env file in the root of your project:

OPENAI_API_KEY=YOUR_KEY_HERE

The naming is important, as it's the variable LlamaIndex looks for by default.
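If you'd rather not rely on that default lookup, the OpenAI class also accepts an apiKey option. A minimal sketch, assuming you store the key under a variable name of your own choosing:

import { OpenAI, Settings } from "llamaindex";

// Assumption: MY_OPENAI_KEY is your own environment variable name
Settings.llm = new OpenAI({
  model: "gpt-3.5-turbo",
  apiKey: process.env.MY_OPENAI_KEY,
});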

Create Frontend Code

In the root folder of your app directory, update your page.tsx file to contain the following:

"use client";

import { useState } from "react";
import { Toaster, toast } from "sonner";

export default function Home() {
  const [response, setResponse] = useState("");
  const [loading, setLoading] = useState(false);
  const [inputText, setInputText] = useState("");

  const handleSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    if (!inputText) {
      toast.error("Please provide a question");
      return;
    }
    try {
      setLoading(true);
      const res = await fetch("/api/rag", {
        headers: {
          "Content-Type": "application/json",
        },
        method: "POST",
        body: JSON.stringify({ query: inputText }),
      });

      if (!res.ok) {
        throw new Error("Request failed");
      }
      const data = await res.json();

      setInputText("");
      setResponse(data.response);
      setLoading(false);
    } catch (error) {
      console.error(error);
      toast.error("Something went wrong");
      setLoading(false);
    }
  };

  return (
    <main className="flex min-h-screen flex-col bg-gradient-to-b from-gray-900 to-gray-800">
      <Toaster />
      <form onSubmit={handleSubmit} className="flex flex-col pt-20 items-center">
        <input
          value={inputText}
          onChange={(e) => setInputText(e.target.value)}
          type="text"
          name="question"
          placeholder="Ask me something..."
          className="p-4 border border-gray-300 rounded-lg max-w-96 w-full text-gray-900 placeholder:text-gray-500"
        />

        <button
          disabled={loading}
          type="submit"
          className="p-4 bg-blue-600 hover:bg-blue-500 text-white rounded-lg max-w-96 w-full mt-4"
        >
          {loading ? (
            <span className="animate-pulse">Aligning satellites 📡</span>
          ) : (
            "Submit"
          )}
        </button>
      </form>
      {response && (
        <div className="flex justify-center">
          <div className="max-w-[720px] text-gray-50 p-6 border border-gray-500 bg-gray-700 rounded m-8">
            {response}
          </div>
        </div>
      )}
    </main>
  );
}

This sets up the UI that collects a question from the user and renders the response from the API.
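If you want a bit more type safety on the frontend, you could describe the shape of the API's JSON payload instead of leaving data untyped. A small sketch (the RagResponse name is just an illustration):

// Shape of the JSON returned by /api/rag
type RagResponse = { response: string };

// ...then, inside handleSubmit:
const data = (await res.json()) as RagResponse;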

Create Backend Code

This route handler uses the App Router, so it lives in the app directory (not pages). Inside app, create a new folder named api and then a folder named rag inside it (this gives us an endpoint at /api/rag). Inside the rag folder, create a file named route.ts and add the following code:

import fs from "node:fs/promises";
import { Document, VectorStoreIndex, OpenAI, Settings } from "llamaindex";

Settings.llm = new OpenAI({ model: "gpt-3.5-turbo" });

export async function POST(request: Request) {
  try {
    const { query } = await request.json();
    if (!query) {
      throw new Error("Input is required");
    }

    // Load essay from abramov.txt in Node
    const path = "./node_modules/llamaindex/examples/abramov.txt";

    const essay = await fs.readFile(path, "utf-8");
    // Create Document object with essay
    const document = new Document({ text: essay, id_: path });

    // Split text and create embeddings. Store them in a VectorStoreIndex
    const index = await VectorStoreIndex.fromDocuments([document]);

    // Query engine convenience function
    const queryEngine = index.asQueryEngine();
    const { response } = await queryEngine.query({
      query,
    });

    return Response.json({ response });
  } catch (err) {
    console.error(err);
    return new Response(`Something went wrong.`, { status: 400 });
  }
}
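One thing worth noting: this handler re-reads and re-embeds the essay on every request. As a sketch of one possible improvement (assuming a long-lived server process, so it won't help across cold starts), you could cache the index at module scope in the same route.ts:

// Cache the index so the essay is only read and embedded once per process
let cachedIndex: VectorStoreIndex | null = null;

async function getIndex(path: string) {
  if (!cachedIndex) {
    const essay = await fs.readFile(path, "utf-8");
    const document = new Document({ text: essay, id_: path });
    cachedIndex = await VectorStoreIndex.fromDocuments([document]);
  }
  return cachedIndex;
}

The POST handler would then call getIndex(path) instead of building the index inline.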

Run Your Application

Now, you can test your Next.js application by running:

npm run dev
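Visit http://localhost:3000, type a question, and hit Submit. You can also exercise the endpoint directly from another terminal (the query text here is just an example):

curl -X POST http://localhost:3000/api/rag \
  -H "Content-Type: application/json" \
  -d '{"query": "What is this essay about?"}'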

I hope this example can serve as a starting point for building more complex applications.
