Journey in Rust: API Calling and ChatGPT in Rust - Part 2


Welcome back, fellow adventurers! In the last part, we delved into the world of user input handling using the clap crate. Today, we will embark on a new quest: calling an API to retrieve some data! Let's buckle up and continue our exciting journey through the Rust-ic landscape!

Step 1: Adding the reqwest dependency

To call an API, we need a trusty sidekick. And for that, we will enlist the help of the reqwest library. It's like having a faithful carrier pigeon that fetches the information you seek!

First, let's add reqwest as a dependency in our Cargo.toml file:

[dependencies]
reqwest = { version = "0.11", features = ["json"] }

This tells Cargo to bring reqwest v0.11 with the "json" feature enabled into our project. The "json" feature allows us to easily work with JSON data returned by the API.

Step 2: Creating an API request using reqwest

Now that we have our sidekick on board, let's test out how to call an API with the reqwest library. We'll start by trying to get information about a word using the Free Dictionary API.

Let's dive into the changes we've made in our main.rs file:

use std::error::Error;

async fn main() {
    let arguments = Args::parse();
    let url = format!(
        "https://api.dictionaryapi.dev/api/v2/entries/en/{}",
        arguments.query
    );
    let b = reqwest::get(url).await?;
    let results = b.json().await.expect("Error while parsing json");
    println!("{:?}", results);
}

Here's what we've done:

  • We imported the Error trait from std::error (think of it as bringing in an expert who can diagnose issues that might arise during our quest).

  • We made our main() function async to allow us to use the await keyword when calling asynchronous functions. It's like switching from a walkie-talkie to a futuristic communicator with instantaneous messaging capabilities!

  • We constructed the API URL using the format! macro and our trusty user query (arguments.query).

  • We called the API using reqwest::get(url).await?. This sends our carrier pigeon on its way to fetch the data we need.

  • We parsed the JSON data returned by the API using b.json().await.expect("Error while parsing json");.

  • Finally, we printed out the parsed data using println!("{:?}", results);.

  • The .expect("Error while parsing json") is like a safety net for our program. When we're walking on the tightrope of parsing JSON data, this safety net catches us if something goes wrong. In technical terms, it's a method on the Result type that either returns the value inside an Ok variant or panics with the provided error message if it encounters an Err variant. It's a blunt instrument (the program stops right there), but it lets us fail fast with a helpful message instead of silently carrying bad data forward. A sketch of the equivalent match is shown just below.
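For the curious, here's a rough sketch of what .expect() is doing for us, spelled out with match. The parse_body helper and the serde_json::Value target type are placeholders for illustration (serde_json itself only joins our project in Step 7):

use serde_json::Value;

// `response` stands in for the awaited reqwest::Response from above.
async fn parse_body(response: reqwest::Response) -> Value {
    match response.json::<Value>().await {
        // Ok: hand the parsed JSON back to the caller
        Ok(value) => value,
        // Err: stop the program with our error message, exactly like .expect()
        Err(err) => panic!("Error while parsing json: {}", err),
    }
}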

Why do we need async?
Imagine you're at a theme park with a group of friends. You're in charge of getting food for everyone, but the lines are long at the different food stands. Instead of waiting in each line one by one, you split up and wait in multiple lines simultaneously. This way, you can get all the food faster and save time.
In the programming world, this theme park scenario is similar to handling asynchronous tasks. Traditionally, programs execute tasks one after another (synchronously), causing some tasks to be delayed while waiting for previous tasks to complete. With asynchronous programming (or "async"), we can run multiple tasks concurrently without blocking each other's progress, leading to better performance and more efficient use of resources.
In technical terms, async allows us to write non-blocking code that can run multiple tasks concurrently. When a task needs to wait for an external operation (e.g., a network request), it can yield control back to the runtime system, allowing other tasks to continue running. Once the external operation is completed, the original task can resume from where it left off. This approach helps to prevent blocking the entire application and increases overall efficiency.
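To make the theme-park picture concrete, here's a small self-contained sketch (using tokio, which we'll add in Step 4, and made-up food stands) where two "lines" are waited on at the same time:

use std::time::Duration;

// Simulate standing in line at a food stand for `secs` seconds.
async fn wait_in_line(stand: &str, secs: u64) -> String {
    tokio::time::sleep(Duration::from_secs(secs)).await;
    format!("{} is ready", stand)
}

#[tokio::main]
async fn main() {
    // Both lines are waited on concurrently, so the total time is
    // roughly max(2, 3) seconds rather than 2 + 3 seconds.
    let (burgers, fries) = tokio::join!(
        wait_in_line("burgers", 2),
        wait_in_line("fries", 3),
    );
    println!("{burgers}, {fries}");
}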

Step 3: Encountering Errors

As all great adventurers know, challenges are bound to appear on your journey. In our case, we ran into two errors:

  1. The main function is not allowed to be async. To overcome this obstacle, we'll use another powerful ally: the tokio runtime. Tokio is like an enchanted steed that gallops through the land of async functions and helps us navigate the treacherous terrain.

  2. The ? operator can only be used in a function that returns a Result or Option (or another type that implements FromResidual). Right now, our main function returns (), which won't work with the ? operator.

Let's see how we can overcome these challenges!

Step 4: Introducing Tokio

First, we'll add the tokio runtime as a dependency in our Cargo.toml file:

[dependencies]
tokio = { version = "1", features = ["full"] }

Now, let's fix the errors in our main.rs file:

#[tokio::main]
async fn main() {
    // The rest of your code
}

We added the #[tokio::main] attribute to our main() function. This tells Rust that we want to use the Tokio runtime to handle our async functions, allowing us to keep the async keyword on the main function.

The #[tokio::main] attribute is used to declare the entry point of an asynchronous application using the Tokio runtime. It transforms your async fn main() into a synchronous function that initializes the Tokio runtime and then runs your asynchronous code to completion.

In other words, it's like a bridge that allows your async functions to run within the context of the main function, which is typically synchronous. By using #[tokio::main], you're telling Rust to set up the necessary environment for executing async functions, enabling you to use async/await in your main function seamlessly.

This is important because, without it, you wouldn't be able to directly use async functions and .await inside your main function, which is required when working with libraries like reqwest for making HTTP requests. The Tokio runtime provides the necessary infrastructure to manage asynchronous tasks and execute them efficiently.
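Roughly speaking, #[tokio::main] rewrites our async main into an ordinary synchronous main that builds a runtime and drives the async body to completion. This is a simplified sketch of the idea, not the macro's exact output:

fn main() {
    // Build a multi-threaded Tokio runtime...
    tokio::runtime::Builder::new_multi_thread()
        .enable_all()
        .build()
        .expect("failed to build Tokio runtime")
        // ...and run our async code on it until it finishes.
        .block_on(async {
            // ...the body of our async fn main() goes here...
        });
}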

However, we still need to address the second error.

Step 5: Handling errors with the Result

Instead of returning (), we can change the return type of our main function to Result<(), Box<dyn Error>>. This way, our function will return either an empty tuple (()) or an error wrapped inside a Box (a heap-allocated container for holding any error implementing the Error trait). It's like having a special box that can hold any problem we might face!

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // The rest of your code
}
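As a small aside, Box<dyn Error> is what lets the ? operator propagate completely different error types out of the same function. Here's a hedged sketch with made-up helpers (not code from our tool):

use std::error::Error;

// Two operations that fail with *different* error types.
fn parse_number(s: &str) -> Result<i32, std::num::ParseIntError> {
    s.parse()
}

fn read_file(path: &str) -> Result<String, std::io::Error> {
    std::fs::read_to_string(path)
}

fn run() -> Result<(), Box<dyn Error>> {
    // `?` converts each concrete error into a Box<dyn Error> for us,
    // so a single return type can carry either kind of failure.
    let n = parse_number("42")?;
    let text = read_file("notes.txt")?;
    println!("{} / {} bytes", n, text.len());
    Ok(())
}

fn main() {
    if let Err(e) = run() {
        eprintln!("something went wrong: {}", e);
    }
}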
With that in place, let's run our program and pass it a query:

cargo run -- --query=hello

Step 6: Fixing the type inference error

We encountered another error during our journey:

error[E0698]: type inside `async` block must be known in this context
  --> src/main.rs:27:20
   |
27 |     let results= b.json().await.expect("Error while parsing json");
   |                    ^^^^ cannot infer type for type parameter `T` declared on the associated function `json`

Fear not, my fellow explorers! We can easily overcome this obstacle. The problem here is that Rust cannot infer the type of data we're expecting from the API. To fix this issue, we need to create a structure (or structures) that represents the shape of the data returned by the API.

Let's create some new structures and update our code accordingly. (The Deserialize and Serialize derives come from the serde crate, so if it isn't already in your Cargo.toml, add serde with its derive feature enabled.)

use serde::{Deserialize, Serialize};

#[derive(Deserialize, Debug, Serialize)]
struct ApiResponse {
    word: String,
    phonetic: Option<String>,
    phonetics: Option<Vec<Phonetic>>,
    origin: Option<String>,
    meanings: Vec<Meaning>,
}

#[derive(Deserialize, Debug, Serialize)]
struct Phonetic {
    text: String,
    audio: Option<String>,
}

#[derive(Deserialize, Debug, Serialize)]
struct Meaning {
    partOfSpeech: String, // named to match the JSON key; Rust will warn that it isn't snake_case
    definitions: Vec<Definition>,
}

#[derive(Deserialize, Debug, Serialize)]
struct Definition {
    definition: String,
    example: Option<String>,
    synonyms: Option<Vec<String>>,
    antonyms: Option<Vec<String>>,
}

#[derive(Deserialize, Debug, Serialize)] is like a magic potion that grants your data structures three superpowers:

  1. Deserialize: Transforms JSON (or other serialized formats) back into Rust structures. It's like a translator who reads foreign texts and explains them in your native language.

  2. Debug: Allows you to print the structure in a human-readable format for easy debugging. Think of it as a friendly guide who describes the contents of a mysterious artifact.

  3. Serialize: Converts Rust structures into JSON (or other serialized formats). It's like an expert scribe who takes your message and writes it in a language that others can understand.
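Here's a tiny, self-contained sketch of those three superpowers in action, using a made-up Word struct (it assumes serde and serde_json are both available as dependencies):

use serde::{Deserialize, Serialize};

#[derive(Deserialize, Debug, Serialize)]
struct Word {
    word: String,
    phonetic: Option<String>,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Deserialize: JSON text -> Rust struct
    let parsed: Word = serde_json::from_str(r#"{ "word": "hello", "phonetic": "hello" }"#)?;

    // Debug: print the struct in a human-readable form
    println!("{:?}", parsed);

    // Serialize: Rust struct -> JSON text
    let back_to_json = serde_json::to_string(&parsed)?;
    println!("{}", back_to_json);
    Ok(())
}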

These structures represent the hierarchy of data we expect to receive from the API. Now we can update our call to b.json().await to specify the expected type as Vec<ApiResponse>:

let results: Vec<ApiResponse> = b.json().await.expect("Error while parsing json");
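Equivalently, you can spell the type directly on the call with Rust's turbofish syntax instead of annotating the variable; both forms mean the same thing:

// Same as above, with the target type given directly on json():
let results = b.json::<Vec<ApiResponse>>().await.expect("Error while parsing json");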

Now our code should run successfully!

Step 7: Pretty-printing the results

To make our output more readable, we'll use the serde_json library's to_string_pretty function. First, add the serde_json dependency to your Cargo.toml file:

[dependencies]
serde_json = "1.0.95"

Now let's update our main.rs file to pretty-print the results:

use serde_json::to_string_pretty;

// ...

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // ...
    let pretty_results = to_string_pretty(&results[0].meanings).expect("Error while pretty-printing results");
    println!("{}", pretty_results);
    Ok(())
}

We used the to_string_pretty function to convert our parsed JSON data into a nicely formatted string, and then printed it out.

Passing a reference like &results[0].meanings to to_string_pretty is similar to sharing the same treasure map with your fellow adventurers. Instead of creating multiple copies of the map, you're using a single reference (the ampersand &) to access the same data (results). This allows for efficient memory usage and prevents unnecessary duplication while still enabling everyone to follow the same path to the treasure trove of information! Technically, this means we're passing an immutable reference to the results variable instead of creating a new copy of it.
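A quick illustration with a throwaway helper (not part of our tool): the function below only borrows the data, so the caller keeps ownership and can keep using it afterwards.

// Borrowing: the function reads the data through a reference,
// so nothing is moved or cloned.
fn count_items(items: &[String]) -> usize {
    items.len()
}

fn main() {
    let meanings = vec!["noun".to_string(), "verb".to_string()];
    let n = count_items(&meanings); // lend the data, don't move it
    println!("{} meanings: {:?}", n, meanings); // `meanings` is still usable here
}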

Next, we will explore how to use environment variables to store sensitive data such as API keys and integrate them into our project. Let's saddle up and continue our ride through the Rust-ic landscape!

Step 8: Storing sensitive data with dotenv

It's not safe to store sensitive information like API keys directly in our code. Instead, we should use environment variables to keep this information secure. To make working with environment variables easier, we'll use the dotenv crate. Think of it as a magical chest that safely stores our precious secrets!

First, let's add the dotenv dependency in our Cargo.toml file:

[dependencies]
dotenv = "0.15.0"

Now, let's create a .env file at the root of our project:

OPEN_AI_API_KEY=your_api_key_here

Be sure to replace your_api_key_here with your actual API key.

Note: Don't forget to add .env to your .gitignore file so you don't accidentally push sensitive information to your repository!

With our secrets safely stored, it's time to retrieve them in our code.

Step 9: Accessing environment variables with dotenv

Let's see how we can access the OPEN_AI_API_KEY environment variable in our main.rs file:

use std::{error::Error, env};
use dotenv::dotenv;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    dotenv().ok();
    let arguments = Args::parse();
    let open_ai_api_key = env::var("OPEN_AI_API_KEY").expect("OPEN_AI_API_KEY not set");

    // The rest of your code

    Ok(())
}

Here's what we've done:

  • We imported env from the std crate to work with environment variables.

  • We imported the dotenv function from the dotenv crate to load our .env file.

  • We called dotenv().ok(); at the beginning of our main() function to load our environment variables from the .env file.

  • We retrieved the OPEN_AI_API_KEY environment variable using env::var("OPEN_AI_API_KEY"). This returns a Result<String, VarError>, so we used .expect("OPEN_AI_API_KEY not set") to unwrap the value or panic if it's not set.
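If you'd rather not panic when the key is missing, a gentler alternative (just a sketch, not what our tool does) is to match on the Result and exit with a friendlier message:

use std::env;

fn main() {
    // env::var returns Result<String, VarError>; handle the Err arm ourselves
    // instead of panicking via .expect().
    let open_ai_api_key = match env::var("OPEN_AI_API_KEY") {
        Ok(key) => key,
        Err(_) => {
            eprintln!("OPEN_AI_API_KEY not set; add it to your .env file");
            std::process::exit(1);
        }
    };
    println!("key loaded ({} characters)", open_ai_api_key.len());
}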

Now that we have our API key safely stored and accessible, we can proceed to call our desired API!

Step 10: Integrating with the OpenAI API

We will now transition from using the Free Dictionary API to the OpenAI API. To do this, let's first remove the unused structs in our main.rs file:

// Remove these structs, as we won't be using them anymore:
struct ApiResponse { ... }
struct Phonetic { ... }
struct Meaning { ... }
struct Definition { ... }

We've just learned how to use environment variables to store sensitive data like API keys. Now, we'll explore how to make requests to the OpenAI API and handle the responses. Let's get our engines started and dive right into it!

To begin with, let's create the necessary structs to represent the OpenAI API response:

#[derive(Debug, Deserialize)]
struct ApiResponse {
    choices: Vec<Choice>,
}

#[derive(Debug, Deserialize)]
struct Choice {
    message: Message,
}

#[derive(Debug, Deserialize)]
struct Message {
    content: String,
}

Here, we've created three structs:

  • ApiResponse represents the top-level response object from the API. It contains a choices field that holds an array of Choice objects.

  • Choice represents a choice returned by the API. It contains a message field that holds a Message object.

  • Message represents the actual message returned by the API. It contains a content field that holds the text of the message as a string.

With these structs in place, we can now make requests to the OpenAI API and deserialize the JSON responses into Rust structures.
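For reference, here is a trimmed-down sketch of the JSON shape these structs are written against, parsed with serde_json (the real response carries more fields, such as model information and usage counts, which serde simply ignores because we don't declare them):

// A minimal example of the response shape, deserialized into our structs.
let sample = r#"{
    "choices": [
        { "message": { "content": "Hello! How can I help you today?" } }
    ]
}"#;

let parsed: ApiResponse = serde_json::from_str(sample).expect("sample should parse");
println!("{}", parsed.choices[0].message.content);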

Step 11: Making requests to the OpenAI API

Let's see how we can send a request to the OpenAI API and handle the response:

use reqwest::{
    header::{self, HeaderMap, HeaderValue},
    Client,
};
use serde_json::json;

// ... (other code)

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    dotenv().ok();
    let arguments = Args::parse();
    let open_ai_api_key = env::var("OPEN_AI_API_KEY").expect("OPEN_AI_API_KEY not set");
    let query = arguments.query.to_owned();
    let client = Client::new();

    let url = "https://api.openai.com/v1/chat/completions";

    let headers: HeaderMap<HeaderValue> = header::HeaderMap::from_iter(vec![
        (header::CONTENT_TYPE, "application/json".parse().unwrap()),
        (
            header::AUTHORIZATION,
            format!("Bearer {}", open_ai_api_key).parse().unwrap(),
        ),
    ]);

    let body = json!(
        {
            "model":"gpt-3.5-turbo",
            "messages":[{
                "role":"user",
                "content": query,
            }]
        }
    );

    let response: ApiResponse = client
        .post(url)
        .headers(headers)
        .json(&body)
        .send()
        .await?
        .json()
        .await?;

    println!("{}", &response.choices[0].message.content);

    Ok(())
}

Here's what we've done:

  • Imported the necessary dependencies from reqwest to create an HTTP client and handle request headers.

  • Created an instance of reqwest::Client.

  • Defined the URL for the OpenAI API endpoint.

  • Created a HeaderMap with our custom headers, including the Content-Type and Authorization headers. The Authorization header uses our open_ai_api_key from the environment variables.

  • Created a JSON payload using the json! macro from serde_json. We set the model to "gpt-3.5-turbo" and provide our query as the content of a user message.

  • Sent a POST request to the API with our headers and JSON payload, awaited the response, and deserialized it into an instance of our ApiResponse struct.

  • Printed the content of the first choice's message to the console.
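One thing this sketch glosses over: if the API responds with an error status (for example 401 for a bad key), the .json() call will then fail with a confusing deserialization error because the body doesn't contain choices. A hedged refinement is to surface the HTTP error first using reqwest's error_for_status():

// A variation of the same request that turns 4xx/5xx responses into an
// error right away, instead of failing later while parsing the body.
let response: ApiResponse = client
    .post(url)
    .headers(headers)
    .json(&body)
    .send()
    .await?
    .error_for_status()? // bail out here on 401, 429, 500, ...
    .json()
    .await?;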

Time to take our tool for a spin:

cargo run -- -q "Convert mov to mp4 using ffmpeg"

To convert a MOV video file to MP4 using FFmpeg, follow these steps:

1. Download and install FFmpeg if you haven't already.
2. Open a Command Prompt or Terminal window on your computer.
3. Navigate to the directory where your MOV file is located using the "cd" command.
4. Use the following command to convert the MOV file to MP4:

`ffmpeg -i input.mov -c:v libx264 -preset slow -crf 22 -c:a copy output.mp4`

Additional information:

What's the Client here? We weren't using it before.
Client is an essential part of the reqwest library, allowing us to create and manage HTTP requests. Think of it as a browser that can send requests and receive responses from web servers on our behalf.
Before, we were only working with local environment variables and command line arguments. But now, we want to interact with external APIs like OpenAI's, so we need the Client to handle the communication. It offers a user-friendly interface to send requests, set custom headers, and process responses, making it easier for us to work with APIs in Rust.
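For example (a sketch, not part of our tool), a single Client can be built once and reused for many requests, which lets reqwest pool connections under the hood:

use reqwest::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build the client once...
    let client = Client::new();

    // ...and reuse it for several requests.
    for word in ["hello", "world"] {
        let status = client
            .get(format!("https://api.dictionaryapi.dev/api/v2/entries/en/{}", word))
            .send()
            .await?
            .status();
        println!("{word}: {status}");
    }
    Ok(())
}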

What's header::HeaderMap::from_iter?
header::HeaderMap::from_iter is a method provided by the reqwest crate that allows you to create a HeaderMap (a collection of HTTP headers) from an iterator. The iterator should yield key-value pairs where the key is a header name, and the value is the corresponding header value.
Think of it like arranging a set of ingredients in a specific order before cooking. The HeaderMap::from_iter method takes these "ingredients" (header name-value pairs) and neatly places them in the right spots to create a proper HeaderMap. This organized structure makes it easy to use the headers while making HTTP requests with reqwest.
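The same headers could also be built up with individual insert calls, which some people find easier to read. An equivalent sketch, assuming the open_ai_api_key variable from our main:

use reqwest::header::{self, HeaderMap, HeaderValue};

// Start from an empty map and insert each header by name.
let mut headers = HeaderMap::new();
headers.insert(header::CONTENT_TYPE, HeaderValue::from_static("application/json"));
headers.insert(
    header::AUTHORIZATION,
    format!("Bearer {}", open_ai_api_key)
        .parse::<HeaderValue>()
        .expect("API key contains invalid header characters"),
);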

With these changes, our command line tool can now make requests to the OpenAI API and display the results!

In the next section, we'll look into making this response more tailored.

Cover: Bhupesh

Repository: termoil
