How Firebase Genkit helped add AI to our Compass app

MAY 15, 2024
Alexander Nohe Developer Relations Engineer
Arthur Thompson Developer Relations Engineer

Integrating generative AI into your app can help you differentiate your business and delight your users, but developing and refining AI-powered features beyond a prototype is still challenging. After speaking with app developers who are just beginning their AI development journey, we learned that many are overwhelmed with the number of new concepts to learn and the task of making these features scalable, secure, and reliable in production.

That is why we built Firebase Genkit, an open source framework for building sophisticated AI features into your apps with developer-friendly patterns and paradigms. It provides libraries, tooling, and plugins to help developers build, test, deploy, and monitor AI workloads. It is currently available for JavaScript/TypeScript, with Go support coming soon.

In this post, learn about some of Genkit’s key capabilities and how we used them to add generative AI to Compass, our travel planning app.


Robust developer tooling

The unique, non-deterministic nature of generative AI calls for specialized tooling to help you efficiently explore and evaluate possible solutions as you work toward consistent, production-quality outcomes.

Genkit offers a robust tooling experience through its dedicated CLI and browser-based, local developer UI. With the Genkit CLI, you can initialize an AI flow in seconds; then you can launch the developer UI to run it locally. The developer UI is a surface that lets you interact with Genkit components like flows (your end-to-end logic), models, prompts, indexers, retrievers, tools, and more. Components become available to run based on your code and configured plugins, so you can easily test your components with various prompts and queries, and rapidly iterate on outcomes with hot reloading.

[Video: Welcome to Firebase Genkit]

End-to-end observability with flows

All Genkit components are instrumented with OpenTelemetry and custom metadata to enable downstream observability and monitoring. Genkit provides the “flow” primitive as a way to tie together multiple steps and AI components into a cohesive end-to-end workflow. Flows are special functions that are strongly typed, streamable, locally and remotely callable, and fully observable.

Thanks to this awesome instrumentation, when you run a flow in the developer UI, you can “inspect” it to view traces and metrics for each step and component within. These traces include the inputs and outputs for every step, making it easier to debug your AI logic or find bottlenecks that you can improve upon. You can even view traces for deployed flows executed in production.

[Image: An AI flow in Genkit]
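To make the flow idea concrete, here is a toy, pure-TypeScript sketch. It is deliberately not the Genkit API: `defineTracedFlow` and its helpers are made up for illustration. It shows the core property that makes trace inspection possible: a typed, named function whose steps are recorded with their inputs and outputs.

```typescript
type TraceEntry = { step: string; input: unknown; output: unknown };

// defineTracedFlow wraps a function so every named step it runs is
// recorded with its input and output, mimicking (very loosely) how
// Genkit flows are instrumented end to end.
function defineTracedFlow<I, O>(
  name: string,
  fn: (input: I, run: <A, B>(step: string, arg: A, f: (a: A) => B) => B) => O,
) {
  return (input: I): { name: string; output: O; trace: TraceEntry[] } => {
    const trace: TraceEntry[] = [];
    const run = <A, B>(step: string, arg: A, f: (a: A) => B): B => {
      const output = f(arg);
      trace.push({ step, input: arg, output });
      return output;
    };
    return { name, output: fn(input, run), trace };
  };
}

// A toy two-step "trip summary" flow.
const tripFlow = defineTracedFlow("tripFlow", (place: string, run) => {
  const normalized = run("normalize", place, (p) => p.trim().toLowerCase());
  return run("summarize", normalized, (p) => `Trip to ${p}`);
});

const result = tripFlow("  Paris ");
// result.trace now holds one entry per step, with its input and output,
// which is exactly the kind of data the developer UI surfaces per step.
```

Because every step's input and output is captured, debugging a misbehaving flow becomes a matter of reading the trace rather than sprinkling log statements.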

Prompt management with dotprompt

Prompt engineering is more than just tweaking text. The model you use, the parameters you supply, and the format you request all impact your output quality.

Genkit offers dotprompt, a file format that lets you capture the model, its configuration, and the prompt text in a single file kept alongside your code for easier testing and organization. This means you can manage your prompts like regular code: track them in the same version control system and deploy them together. Dotprompt files let you specify the model and its configuration, provide flexible templating based on Handlebars, and define input and output schemas so Genkit can help validate your model interactions as you develop.

---
model: vertexai/gemini-1.0-pro
config:
  temperature: 1.0
input:
  schema:
    properties:
      place: {type: string}
    required: [place]
  default:
    place: New York City
output:
  schema:
    type: object
    properties:
      hotelName: {type: string, description: "hotelName"}
      description: {type: string, description: "description"}
---

Given this location: {{place}}, come up with a fictional hotel name and a
fictional description of the hotel that suits the {{place}}.
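The double-brace placeholders in the prompt above are Handlebars syntax. Conceptually, rendering is variable substitution, with the input schema's defaults filling in missing values. This toy sketch (not the real dotprompt/Handlebars renderer, which supports much more, like helpers and partials) shows the idea:

```typescript
// Toy stand-in for Handlebars-style variable substitution. Missing
// inputs fall back to defaults, mirroring the `input.default` section
// of a dotprompt file.
function renderPrompt(
  template: string,
  input: Record<string, string>,
  defaults: Record<string, string> = {},
): string {
  return template.replace(
    /\{\{(\w+)\}\}/g,
    (_, key: string) => input[key] ?? defaults[key] ?? "",
  );
}

const template =
  "Given this location: {{place}} come up with a fictional hotel name.";

renderPrompt(template, {}, { place: "New York City" });
// → "Given this location: New York City come up with a fictional hotel name."
```

The real renderer also validates the rendered inputs against the declared schema, which is what lets Genkit catch malformed model interactions during development.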

Plugin ecosystem: Google Cloud, Firebase, Vertex AI, and more!

Genkit provides access to pre-built components and integrations for models, vector stores, tools, evaluators, observability, and more through its open ecosystem of plugins built by Google and the community. For a list of existing plugins from Google and the community, explore the #genkit-plugin keyword on npm.

In our app, Compass, we used the Google Cloud plugin to export telemetry data to Google Cloud Logging and Monitoring, the Firebase plugin to export traces to Cloud Firestore, and the Vertex AI plugin to get access to Google’s latest Gemini models.


How we used Genkit

To give you a hands-on look at Genkit's capabilities, we created Compass, a travel planning app designed to showcase a familiar use case.


The initial versions of Compass offered a standard form-based trip planning experience, but we wondered: what would it be like to add an AI-powered trip planning experience with Genkit?


Generating embeddings for location attributes

Since we had an existing database of content, we added out-of-band embeddings for our content using the pgvector extension for Postgres and the textembedding-gecko API from Vertex AI in Go. Our goal was to enable users to search based on what each place is "known for" or a general description. To achieve this, we extracted the "knownFor" attribute for each location, generated embeddings for it, and inserted them alongside the data in our existing table for efficient querying.

// GenerateEmbeddings creates embeddings from the text provided.
func GenerateEmbeddings(
	contentToEmbed,
	project,
	location,
	publisher,
	model,
	titleOfContent string) ([]float64, error) {
	ctx := context.Background()

	apiEndpoint := fmt.Sprintf(
		"%s-aiplatform.googleapis.com:443", location)

	client, err := aiplatform.NewPredictionClient(
		ctx, option.WithEndpoint(apiEndpoint))
	if err != nil {
		return nil, err
	}
	defer client.Close()

	base := fmt.Sprintf(
		"projects/%s/locations/%s/publishers/%s/models",
		project,
		location,
		publisher)

	url := fmt.Sprintf("%s/%s", base, model)

	promptValue, err := structpb.NewValue(
		map[string]interface{}{
			"content":   contentToEmbed,
			"task_type": "RETRIEVAL_DOCUMENT",
			"title":     titleOfContent,
		})
	if err != nil {
		return nil, err
	}

	// PredictRequest: create the model prediction request.
	req := &aiplatformpb.PredictRequest{
		Endpoint:  url,
		Instances: []*structpb.Value{promptValue},
	}

	// PredictResponse: receive the response from the model.
	resp, err := client.Predict(ctx, req)
	if err != nil {
		return nil, err
	}
	pred := resp.Predictions[0]

	// The embedding vector lives in the prediction's
	// "embeddings.values" field.
	embeddings := pred.GetStructValue().AsMap()["embeddings"]
	embedMap, ok := embeddings.(map[string]interface{})
	if !ok {
		return nil, fmt.Errorf("unexpected embeddings format in prediction")
	}
	predSlice := embedMap["values"]
	outSlice := make([]float64, 0)
	for _, v := range predSlice.([]any) {
		outSlice = append(outSlice, v.(float64))
	}

	return outSlice, nil
}

Semantic search for relevant locations

We then created a retriever to search for semantically relevant data based on the user's query, focusing on the "knownFor" field of our locations. To achieve this, we used Genkit's embed function to generate an embedding of the user's query. This embedding is then passed to our retriever, which efficiently queries our database and returns the most relevant location results based on the semantic similarity between the query and the “knownFor” attributes.

export const placeRetriever = defineRetriever(
  {
    name: "postgres/placeRetriever",
    configSchema: QueryOptions,
  },
  async (input, options) => {
    const inputEmbedding = await embed({
      embedder: textEmbeddingGecko,
      content: input,
    });
    const results = await sql`
      SELECT ref, name, country, continent, "knownFor", tags, "imageUrl"
        FROM public.places
        ORDER BY embedding <#> ${toSql(inputEmbedding)} LIMIT ${options.k ?? 3};
    `;
    return {
      documents: results.map((row) => {
        const { knownFor, ...metadata } = row;
        return Document.fromText(knownFor, metadata);
      }),
    };
  },
);

Refining prompts

We organized our prompts as dotprompt files within a dedicated /prompts directory at the root of our Genkit project. For prompt iteration, we had two paths:

1. In-flow testing: Load prompts into a flow that fetches data from the retriever and feeds it to the prompt, as it will work in the end application.

2. Developer UI testing: Load the prompt into the developer UI. This way we can update our prompts in the prompt file and instantly test the changes to gauge the impact on output quality.

When we were satisfied with the prompt, we used the evaluator plugin to assess common LLM metrics like faithfulness, relevancy, and maliciousness using another LLM to judge the responses.
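We used the evaluator plugin's built-in metrics rather than writing our own, but the underlying LLM-as-judge pattern looks roughly like this sketch, in which a judge scores each response and scores are averaged across a dataset. The crude word-overlap stub below stands in for what would really be another model call:

```typescript
type Example = { context: string; response: string };

// A real judge is another LLM call that rates a response for a metric
// such as faithfulness; this stub fakes "faithfulness" as the fraction
// of response words that also appear in the retrieved context.
function stubJudge(example: Example): number {
  const contextWords = new Set(example.context.toLowerCase().split(/\s+/));
  const responseWords = example.response.toLowerCase().split(/\s+/);
  const supported = responseWords.filter((w) => contextWords.has(w)).length;
  return supported / responseWords.length;
}

// Average the per-example metric score across the whole dataset.
function evaluate(
  examples: Example[],
  judge: (e: Example) => number,
): number {
  return examples.reduce((sum, e) => sum + judge(e), 0) / examples.length;
}
```

The evaluator plugin applies the same shape of computation with proper judge prompts for each metric, which is why a single run can report faithfulness, relevancy, and maliciousness side by side.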


Deployed to Cloud Run

Deployment is baked into Genkit's DNA. While it naturally integrates with Cloud Functions for Firebase (along with Firebase Authentication and App Check), we opted to use Cloud Run for this project. Since we were deploying to Cloud Run, we used defineFlow, which automatically generates an HTTPS endpoint for every declared flow when deployed.


Try Genkit yourself

Genkit streamlined our AI development process, from development through to production. The intuitive developer UI was a game-changer, making prompt iteration—a crucial part of adding our AI trip planning feature—a breeze. Plugins enabled seamless performance monitoring and integration with various AI products and services. With our Genkit flows and prompts neatly version-controlled, we confidently made changes knowing we could easily revert if needed. Explore the Firebase Genkit docs to discover how Genkit can help you add AI capabilities to your apps, and try out this Firebase Genkit codelab to implement a similar solution yourself!