Tauri demo app

I built this app to test Tauri. With ~16 hours of work, googling, and copy-pasting, I was able to achieve the following with React + Rust code:

  • State managed by Rust
  • API calls made by Rust, with the results displayed by the front end
  • Storing data in an embedded SQLite DB
  • Embedding migrations into the exe so that we can ship a single binary with all migrations included

Development Log

Setting up the project

https://kent.medium.com/get-started-making-desktop-apps-using-rust-and-react-78a7e07433ce

Just create a React app, then init the Rust code inside it, update the npm run scripts, and you are good to go.

Basics of Tauri

  • Tauri uses a WebView for rendering the HTML UI.
  • The WebView and the main Rust application are separate processes.
  • To communicate from the UI to the backend (the Rust process) we have 2 methods:
    • Events – events are like one-way notifications (see the sketch after this list)
    • Commands – these are Rust functions annotated with #[tauri::command] and registered with the .invoke_handler(tauri::generate_handler![]) method. After this you can call these commands from JS as invoke('update_count', { update: 1 }).then((c: any) => setCount(c))
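
Events are only mentioned above, so here is a minimal sketch of emitting one from Rust, assuming Tauri 1.x; the notify_frontend command and the count-changed event name are made up for illustration. The front end can subscribe with listen from @tauri-apps/api/event.

use tauri::Manager; // emit_all comes from the Manager trait

#[tauri::command]
fn notify_frontend(app: tauri::AppHandle) {
    // One-way notification to every open window; the payload just needs to be serializable.
    app.emit_all("count-changed", 42).expect("failed to emit event");
}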

Shared State in Tauri

You can create shared state that can be accessed in all commands. This is useful for storing all kinds of values:

  • Business-logic-specific variables.
  • The database connection.
  • Variables shared between multiple windows (yes, you can have multiple windows and do crazy message passing).

Example –

use std::sync::Mutex; // needed for the shared counter below

struct AppState {
    count: Mutex<i64>,
}

#[tauri::command]
fn get_count(state: tauri::State<AppState>) -> i64 {
    state.count.lock().unwrap().clone()
}

#[tauri::command]
fn update_count(update: i64, state: tauri::State<AppState>) -> i64 {
    let mut cnt = state.count.lock().unwrap();
    *cnt += update;
    cnt.clone()
}

fn main(){
  let state = AppState {
    count: Default::default(),
  };

  tauri::Builder::default()
    .manage(state)
    .invoke_handler(tauri::generate_handler![
        get_count,
        update_count,
    ])
    .run(tauri::generate_context!())
    .expect("error while running tauri application");
}

API Calling

reqwest is an HTTP client for Rust that makes API calls pretty easy.

#[tauri::command]
async fn get_subreddit(sub: String) -> String {
    println!("{}", sub);
    let url = format!("https://reddit.com/r/{}.json", sub);
    // Return the response body as a string, or an empty string on any request/read error.
    match reqwest::get(url).await {
        Ok(res) => res.text().await.unwrap_or_default(),
        Err(_) => String::new(),
    }
}
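
As a possible refinement (not what the app does), returning a Result lets the front end receive a rejected promise with the actual error instead of an empty string. This sketch assumes reqwest's json feature is enabled and parses the body into a serde_json::Value:

use serde_json::Value;

#[tauri::command]
async fn get_subreddit_json(sub: String) -> Result<Value, String> {
    let url = format!("https://reddit.com/r/{}.json", sub);
    // `?` turns a request error into an Err(String), which JS sees as a rejected promise.
    let res = reqwest::get(url).await.map_err(|e| e.to_string())?;
    res.json::<Value>().await.map_err(|e| e.to_string())
}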

Storing data in SQLite database

Due to the small number of tutorials and libraries, I found this part pretty tricky. diesel is the SQL ORM I used.

Installation
  1. Add deps to Cargo.toml

diesel = { version = "1.4.0", features = ["sqlite"] }
dotenv = "0.10"
  2. Install diesel_cli: cargo install diesel_cli --no-default-features --features "sqlite-bundled"

Note: sqlite-bundled bundles SQLite into the build for you; setting up a SQLite library to link against on Windows can be a bit tricky.

  3. Add the DATABASE_URL to .env (for SQLite this is just the db file path, e.g. DATABASE_URL=store.sqlite)
  4. $ diesel setup
  5. $ diesel migration generate create_todos_table
  6. Put the code below in the newly generated migration files

-- up.sql
CREATE TABLE todos (
  id INTEGER NOT NULL PRIMARY KEY,
  title VARCHAR NOT NULL,
  body TEXT NOT NULL DEFAULT '',
  done BOOLEAN NOT NULL DEFAULT 0
);

-- down.sql
DROP TABLE todos;
  7. $ diesel migration run
  8. Create your models in src/db/models.rs

use crate::schema::todos; // refers to the schema file generated by diesel
use serde::{Serialize, Deserialize}; // makes structs in this module JSON-serializable; we convert these objects to JSON and send them to the UI

#[derive(Queryable, Serialize, Debug)] // these derives add extra functionality to the struct; Debug lets you print it to the console with `dbg!(todo)`
pub struct Todo {
    pub id: i32,
    pub title: String,
    pub body: String,
    pub done: bool,
}


#[derive(Insertable, Serialize, Debug, Clone)]
#[table_name = "todos"]
pub struct NewTodo<'a> {  // this struct is used when inserting into the db; a struct can be Queryable and Insertable at the same time too
    pub title: &'a str,
    pub body: &'a str,
}
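
For context, the src/schema.rs that the todos module above refers to is generated by diesel (via diesel setup / diesel print-schema); for this migration it should look roughly like the following, shown here only for reference:

// src/schema.rs — generated by diesel, do not edit by hand
table! {
    todos (id) {
        id -> Integer,
        title -> Text,
        body -> Text,
        done -> Bool,
    }
}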
  9. Create the connection and query modules

extern crate dotenv;

pub mod models;
use crate::schema::*;
use diesel::prelude::*;
use dotenv::dotenv;
use models::{NewTodo, Todo};
use std::env;

pub fn establish_connection() -> SqliteConnection { // creates and returns a new connection to the DB
    dotenv().ok();

    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    SqliteConnection::establish(&database_url)
        .unwrap_or_else(|_| panic!("Error connecting to {}", database_url))
}
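
The query snippets below all take a conn argument. One way to wire that up (a sketch, not the repo's exact code; DbState and list_todos are illustrative names) is to keep the connection in Tauri's managed state, just like the counter earlier:

// Assumes the imports from the module above plus `use std::sync::Mutex;`.
pub struct DbState {
    pub conn: Mutex<SqliteConnection>,
}

#[tauri::command]
fn list_todos(state: tauri::State<DbState>) -> String {
    let conn = state.conn.lock().unwrap();
    let all = todos::dsl::todos
        .load::<Todo>(&*conn)
        .expect("Error loading todos");
    serde_json::to_string(&all).unwrap()
}

Register it in main with .manage(DbState { conn: Mutex::new(db::establish_connection()) }) and add list_todos to generate_handler!.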

DB query and Serialization

  1. Insert

  // `execute` returns the number of affected rows; SQLite can't hand back the inserted row directly.
  let new_todo = NewTodo { title, body };
  let inserted = diesel::insert_into(todos::table)
      .values(&new_todo)
      .execute(conn)
      .expect("Error saving new todo");
  let todo_json = serde_json::to_string(&inserted).unwrap();
  todo_json
  2. SELECT * FROM todos;

  let all_todos = todos::dsl::todos
      .load::<Todo>(conn)
      .expect("Expect loading posts");
  let serialized = serde_json::to_string(&all_todos).unwrap();
  serialized
  3. Delete

  // `qid` is the id of the todo to delete, passed in as a command parameter.
  use todos::dsl::id;
  let t = todos::dsl::todos.filter(id.eq(&qid));
  diesel::delete(t)
      .execute(conn)
      .expect("Error deleting todo");
  4. Update

  // `t` is the Todo fetched beforehand (see the full command sketch below); we toggle its `done` flag.
  use todos::dsl::{done, id};
  diesel::update(todos::dsl::todos.filter(id.eq(&qid)))
      .set(done.eq(!t.done))
      .execute(conn)
      .expect("Error updating");
  let updated = todos::dsl::todos
      .filter(id.eq(&qid))
      .first::<Todo>(conn)
      .expect("Todo not found");
  serde_json::to_string(&updated).unwrap()

Above, we first update the row and then run a second query to read it back. This is because SQLite doesn’t support returning the updated rows/ids from an UPDATE statement.
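
For completeness, the qid, t, and conn used in these snippets come from the surrounding Tauri command. An illustrative full version of the update (toggle_done is a made-up name, and it assumes the DbState sketch from earlier) would be:

#[tauri::command]
fn toggle_done(qid: i32, state: tauri::State<DbState>) -> String {
    use todos::dsl::{done, id};
    let conn = state.conn.lock().unwrap();
    // `t` is the current row; we only need it to know the present value of `done`.
    let t = todos::dsl::todos
        .filter(id.eq(&qid))
        .first::<Todo>(&*conn)
        .expect("Todo not found");
    diesel::update(todos::dsl::todos.filter(id.eq(&qid)))
        .set(done.eq(!t.done))
        .execute(&*conn)
        .expect("Error updating");
    let updated = todos::dsl::todos
        .filter(id.eq(&qid))
        .first::<Todo>(&*conn)
        .expect("Todo not found");
    serde_json::to_string(&updated).unwrap()
}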

Building the bin

Add the dependency libsqlite3-sys = { version = "0.9.1", features = ["bundled"] }. The SQLite lib we installed earlier was for the CLI; to build the binary we need this dependency so SQLite gets bundled into the app itself.

$ npm run build
$ npm run tauri build

This will create the binary, but you still have one problem: a newly created sqlite file won’t have the tables, so you would have to copy an already-migrated db file into the same directory as the exe.

Embedded Migration

To migrate the sqlite file you normally need the migration files plus the diesel CLI to run them; diesel_migrations does both for us at runtime.

# Cargo.toml
diesel_migrations = { version = "1.4.0", features = ["sqlite"] }

// main.rs
#[macro_use]
extern crate diesel;
#[macro_use] 
extern crate diesel_migrations;
embed_migrations!("./migrations/");
.........

fn main(){
  let conn = db::establish_connection();
  // run the migrations that embed_migrations! compiled into the binary
  embedded_migrations::run(&conn).expect("Error migrating");
  ..........
}
  • embed_migrations! scans the given folder for migration files and compiles all of them into the final build (exe)
  • embedded_migrations::run(&conn) checks the db for migrations that haven’t been applied yet and runs them at startup

Fix rust rebuilding on each db commit

Every time you write to the db, your Rust application will restart. This is because changing any file inside the src-tauri directory triggers a rebuild in dev mode.

In the .env file add TAURI_DEV_WATCHER_IGNORE_FILE=.taurignore, and in .taurignore add store.sqlite.

Future topics to explore

  • Publishing apps with updates and auto-tagging
  • Automatic TS definition generation for Tauri commands
  • Refactoring the backend into separate plugins
  • Directly accessing the SQLite db from Node by giving it fs access
  • Exploring OS-specific UI like notifications and menus
  • Exploring React libraries for making cross-platform UI
  • Checking if I can use the transparent, glass materials of the Windows UI
  • Cross-platform compilation with Docker
  • Building for Android and iOS

Some Sample App Ideas to work on

  • File Converter
  • YouTube Video Downloader
  • Reddit scraper
