
Long-running backend async tasks in Tauri v2

Introduction

Welcome to our tutorial on running backend async tasks in Tauri v2 and seamlessly pushing data to the frontend! In this guide, we'll walk through orchestrating asynchronous backend operations while keeping the frontend updated in real time. Our example project is a Twitch Chat application: we'll establish a connection to Twitch's IRC, fetch messages, and display them live within our application interface.

The core of our approach lies in leveraging Tauri's robust event system to seamlessly bridge the gap between backend processes and frontend presentation. By the end of this tutorial, you'll have a clear understanding of how to synchronize backend async tasks with frontend data updates, empowering you to build responsive and dynamic applications with Tauri.

Notes

For this tutorial, we'll be using Tauri version 2.0.0-beta. Because this is a beta, some of the code may change in the future.

Initializing Project

To kickstart our journey, ensure you have Tauri v2 and its associated tooling installed. If not, you can swiftly set it up by installing create-tauri-app using Cargo:

cargo install create-tauri-app

Then, initialize your Tauri application using:

cargo create-tauri-app --beta

While we'll be utilizing React for the frontend in this tutorial, feel free to adapt the frontend framework of your choice. The crux of our focus, however, will lie in configuring and executing backend tasks efficiently.

Explore alternative installation methods via Tauri's official documentation


Sending Data from Backend

Now that our project is primed, let's dive into creating long-running async tasks capable of dispatching data to the frontend. Our objective? Establish a connection to Twitch's chat service, continuously fetch incoming messages, and seamlessly relay them to our application's frontend.

We'll achieve this by spawning an async task that orchestrates the connection to Twitch's IRC, diligently processing incoming messages. Additionally, we'll equip this task with the capability to dispatch these messages to the frontend using event handlers.

To streamline your setup, we'll specify the necessary dependencies, demonstrate how to initialize the Twitch IRC connection within the chat module, and configure the Tauri builder to seamlessly integrate our newly minted chat module.

Add twitch-irc as a dependency in src-tauri/Cargo.toml. Here are all the dependencies we have at this point:

[build-dependencies]
tauri-build = { version = "2.0.0-beta", features = [] }

[dependencies]
tauri = { version = "2.0.0-beta", features = [] }
tauri-plugin-shell = "2.0.0-beta"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
twitch-irc = "5.0.1"

Next, we'll add basic functionality for connecting to a target channel and reading the chat. The code is mostly just copied from the example on the twitch-irc README.

The setup itself is meant to be called within an async task, and it will also spawn a second async task which bubbles data up to our handler.

use tauri::{async_runtime, EventTarget};
use tauri::{AppHandle, Manager};
use twitch_irc::login::StaticLoginCredentials;
use twitch_irc::message::ServerMessage;
use twitch_irc::ClientConfig;
use twitch_irc::SecureTCPTransport;
use twitch_irc::TwitchIRCClient;

/// Payload emitted to the frontend for each chat message we receive
#[derive(Clone, serde::Serialize)]
struct TwitchMessagePayload {
    id: String,
    text: String,
    sender: String,
    channel: String,
}

pub async fn monitor_chat(
    handle: &AppHandle,
    target_channel: String,
) -> Result<(), Box<dyn std::error::Error>> {
    // The default config gives us an anonymous (read-only) login, which is
    // enough for reading chat
    let config = ClientConfig::default();
    let (mut incoming_messages, client) =
        TwitchIRCClient::<SecureTCPTransport, StaticLoginCredentials>::new(config);

    // Spawn a task that forwards every incoming chat message to the frontend
    let twitch_chat_app_handler = handle.clone();
    let join_handle = async_runtime::spawn(async move {
        let handler = twitch_chat_app_handler;
        while let Some(message) = incoming_messages.recv().await {
            if let ServerMessage::Privmsg(msg) = message {
                let payload = TwitchMessagePayload {
                    id: msg.message_id,
                    text: msg.message_text,
                    sender: msg.sender.login,
                    channel: msg.channel_login,
                };
                let _result =
                    handler.emit_to(EventTarget::app(), "twitch_message_received", payload);
            }
        }
    });

    client.join(target_channel).unwrap();
    join_handle.await.unwrap();
    Ok(())
}

Lastly, we're going to have Tauri run this function. We could run it right when Tauri starts up, but then we'd more than likely miss the first messages that arrive before the frontend has mounted.

An approach I took was to have the frontend send a message to the backend to start the chat monitor. This way, we can ensure that the frontend is ready to receive messages.

More explicitly, when the frontend finishes loading it sends an event called track_stream to the backend with the streamer's name. The backend then starts the chat monitor for that streamer and sends messages to the frontend via the twitch_message_received event that we defined earlier.

// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]

use serde::{Deserialize, Serialize};
use tauri::Manager;
mod chat;

/// Payload sent by the frontend when it wants to start tracking a streamer's chat
#[derive(Serialize, Deserialize, Debug)]
struct StartStreamPayload {
    streamer: String,
}

fn main() {
    tauri::Builder::default()
        .setup(|app| {
            #[cfg(debug_assertions)] // only include this code on debug builds
            {
                let window = app.get_webview_window("main").unwrap();
                window.open_devtools();
                window.close_devtools();
            }
            let handler_clone = app.handle().clone();
            // Wait for the frontend to tell us which streamer's chat to monitor
            app.listen("track_stream", move |event| {
                let handler_clone = handler_clone.to_owned();
                let payload = event.payload();
                let parsed_payload: StartStreamPayload =
                    serde_json::from_str(payload).expect("Could not parse track stream payload");
                // Spawn the long-running monitor without blocking the event handler
                tauri::async_runtime::spawn(async move {
                    let _result = chat::monitor_chat(&handler_clone, parsed_payload.streamer).await;
                });
            });
            Ok(())
        })
        .plugin(tauri_plugin_shell::init())
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}

Displaying Data on the Frontend

The frontend is fairly standard React code. Once the user enters a target stream, it sends a message to the backend to start the chat monitor via the track_stream event.

That starts the chat monitor, and the backend sends messages to the frontend via the twitch_message_received event. The frontend listens for this event and updates the chat window with the new message.

import { useEffect, useState } from "react";
import { emit, listen } from "@tauri-apps/api/event";

// Mirrors the TwitchMessagePayload struct emitted by the backend
type Message = {
  id: string;
  text: string;
  sender: string;
  channel: string;
};

export const App = () => {
  const [targetStream, setTargetStream] = useState<string>("");
  const [messages, setMessages] = useState<Message[]>([]);
  useEffect(() => {
    if (!targetStream) {
      return;
    }
    // Initialize the stream and start listening for messages
    emit("track_stream", { streamer: targetStream }).catch((err) =>
      console.error(`could not track stream ${err}`),
    );
    const unlisten = listen<Message>("twitch_message_received", (event) => {
      const message: Message = {
        id: event.payload.id,
        text: event.payload.text,
        sender: event.payload.sender,
        channel: event.payload.channel,
      };
      setMessages((prevMsgs) => {
        return [...prevMsgs, message];
      });
    }).catch((err) => console.error(`could not receive messages ${err}`));

    return () => {
      // Stop listening when the target stream changes or the component unmounts
      unlisten.then((f) => {
        if (typeof f === "function") f();
      });
    };
  }, [targetStream]);

  return (
    <div>
      <input
        type="text"
        value={targetStream}
        onChange={(e) => setTargetStream(e.target.value)}
      />
      <div>
        {messages.map((msg) => (
          <div key={msg.id}>
            <p>{msg.sender}</p>
            <p>{msg.text}</p>
          </div>
        ))}
      </div>
    </div>
  );
};

Caveat: The code above is simplified, and specifically doesn't cover stopping the chat monitor when the user changes the streamer.
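
If you do need to stop the monitor when the streamer changes, one option (not covered by the code above) is to keep the handle of the spawned task in Tauri's managed state and abort it before starting a new one. Below is a minimal sketch of that idea applied to main; the ActiveMonitor struct is just an illustrative name I've introduced here, and depending on how twitch-irc tears down its connection you may prefer to signal the monitor task to shut down gracefully instead of aborting it.

// Sketch only: a main() variant that aborts the previous chat monitor task
// before spawning a new one. ActiveMonitor is a hypothetical state wrapper.
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]

use std::sync::Mutex;

use serde::{Deserialize, Serialize};
use tauri::{async_runtime::JoinHandle, Manager};

mod chat;

#[derive(Serialize, Deserialize, Debug)]
struct StartStreamPayload {
    streamer: String,
}

// Holds the handle of the currently running chat monitor task, if any
struct ActiveMonitor(Mutex<Option<JoinHandle<()>>>);

fn main() {
    tauri::Builder::default()
        .manage(ActiveMonitor(Mutex::new(None)))
        .setup(|app| {
            let handler_clone = app.handle().clone();
            app.listen("track_stream", move |event| {
                let handle = handler_clone.clone();
                let parsed_payload: StartStreamPayload = serde_json::from_str(event.payload())
                    .expect("Could not parse track stream payload");

                // Abort any monitor that is already running before spawning a new one.
                // Depending on how twitch-irc cleans up, you may instead want to send
                // the task a shutdown signal and let it exit on its own.
                let state = handle.state::<ActiveMonitor>();
                let mut active = state.0.lock().unwrap();
                if let Some(previous) = active.take() {
                    previous.abort();
                }

                let task_handle = handle.clone();
                *active = Some(tauri::async_runtime::spawn(async move {
                    let _result = chat::monitor_chat(&task_handle, parsed_payload.streamer).await;
                }));
            });
            Ok(())
        })
        .plugin(tauri_plugin_shell::init())
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}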

Conclusion

And that's it! We've successfully set up a Twitch chat monitor using Rust and Tauri: the backend runs a long-lived async task that listens for chat messages and pushes them to the frontend over Tauri's event system, and the frontend listens for those events and updates the UI accordingly.