# Overview Fauna is a truly serverless database that combines document flexibility with native relational capabilities, offering auto-scaling, multi-active replication, and HTTPS connectivity. ## [](#key-concepts-and-features)Key concepts and features ### [](#relational-model)Relational model Fauna’s data model integrates the best aspects of document and relational databases. Like other document databases, data is stored in JSON-like documents, allowing for the storage of unstructured data and removing the need for an object-relational mapper (ORM) to translate between objects and tables. Fauna also provides key features of relational databases including strong consistency, first-class support for relationships between documents, and the ability to layer on and enforce schema over time. [Documents](../../learn/data-model/documents/) in Fauna are organized into [collections](../../learn/data-model/collections/), similar to tables in relational databases, providing a familiar structure for data organization. Collections can specify [document types](../../learn/schema/#document-type-definitions), which define and enforce the structure of documents they contain. This feature allows developers to start with a [flexible schema](../../learn/schema/#type-enforcement) and gradually introduce more structure as their application matures. Importantly, Fauna supports [relationships between documents](../../learn/data-model/relationships/) in different collections, enabling complex data modeling without duplicating data. This approach combines the ease of use of document databases with the powerful data modeling capabilities of relational systems. ### [](#distributed-architecture)Distributed architecture Fauna is built on a distributed architecture that offers global data distribution across multiple regions. It provides strong consistency for all transactions, even across geographic regions, a feature that sets Fauna apart from many other distributed databases. ### [](#fauna-query-language)Fauna Query Language [Fauna Query Language (FQL)](../../learn/query/) is a TypeScript-inspired language designed specifically for querying and manipulating data in Fauna. It offers a concise, yet expressive syntax for relational queries, supporting complex joins and data transformations. FQL includes optional static typing to catch errors early in development, improving code quality and reducing runtime issues. One of FQL’s powerful features is the ability to create [user-defined functions (UDFs)](../../learn/schema/user-defined-functions/). These allow developers to encapsulate complex business logic directly within the database, promoting code reuse and maintaining a clear separation of concerns. Here’s an example of an FQL query: ```fql // Gets the first customer with // an email of "alice.appleseed@example.com". let customer = Customer.where(.email == "alice.appleseed@example.com") .first() // Gets the first order for the customer, // sorted by descending creation time. Order.where(.customer == customer) .order(desc(.createdAt)). first() { // Project fields from the order. // The order contains fields with document references. // Projecting the fields resolves references, // similar to a SQL join. 
  // `Customer` document reference:
  customer {
    name,
    email
  },
  status,
  createdAt,
  items {
    // Nested `Product` document reference:
    product {
      name,
      price,
      stock,
      // `Category` document reference:
      category {
        name
      }
    },
    quantity
  },
  total
}
```

This query shows how FQL can succinctly express complex operations, including lookups, joins, sorting, and data projection.

### [](#fauna-schema-language)Fauna Schema Language

[Fauna Schema Language (FSL)](../../learn/schema/) allows developers to define and manage database schema as code. It enables version control for schema changes, [integration with CI/CD pipelines](../../learn/schema/manage-schema/#cicd), and [progressive schema enforcement](../../learn/schema/#type-enforcement) as applications evolve. By treating database schema as code, teams can apply the same rigorous review and testing processes to database changes as they do to application code.

Here’s an example of an FSL schema definition:

```fsl
collection Customer {
  name: String
  email: String
  address: {
    street: String
    city: String
    state: String
    postalCode: String
    country: String
  }

  // Wildcard constraint.
  // Allows arbitrary ad hoc fields of any type.
  *: Any

  // Use a computed field to get the customer's current cart `Order`, if any.
  compute cart: Order? = (customer => Order.byCustomerAndStatus(customer, 'cart').first())

  // Use a computed field to get the Set of Orders for a customer.
  compute orders: Set<Order> = (customer => Order.byCustomer(customer))

  // Use a unique constraint to ensure no two customers have the same email.
  unique [.email]

  index byEmail {
    terms [.email]
  }
}
```

This schema defines a `Customer` collection with specific fields, computed fields, a uniqueness constraint, and an index. The `*: Any` [wildcard constraint](../../learn/schema/#wildcard-constraint) allows arbitrary ad hoc fields, providing flexibility while still enforcing structure where needed.

### [](#transactions-and-consistency)Transactions and consistency

In Fauna, every query is a transaction, ensuring ACID compliance across all operations, even in globally distributed region groups. Fauna’s distributed transaction engine, based on the Calvin protocol, provides strict serializability for all read-write queries and serializable isolation for read-only queries. This guarantees real-time consistency across all replicas without relying on clock synchronization, eliminating anomalies common in systems dependent on synchronized clocks.

Fauna’s strong consistency model has been verified by Jepsen, an independent third-party analysis firm, confirming its ability to maintain high levels of consistency even under various failure scenarios. Despite these robust guarantees, Fauna maintains high performance and low latency, even for geographically distributed deployments, thanks to its architecture that separates transaction ordering from execution.

### [](#security-and-access-control)Security and access control

Fauna provides comprehensive security features to protect your data and control access. It offers [role-based access control (RBAC)](../../learn/security/roles/) for coarse-grained permissions and [attribute-based access control (ABAC)](../../learn/security/abac/) for fine-grained, dynamic access rules. This combination allows for highly flexible and precise control over who can access and modify data.

The system includes [built-in user authentication](../../learn/security/authentication/) and also supports integration with [third-party identity providers](../../learn/security/authentication/), allowing you to use existing authentication systems. All data in Fauna is encrypted at rest and in transit, ensuring protection at all times.
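As noted in the transactions section above, every FQL query, including one that writes to several collections, runs as a single ACID transaction. Here’s a minimal sketch that reuses the e-commerce collections from the earlier examples; the document ID and field values are illustrative:

```fql
// Both writes run in one transaction:
// either both documents are created, or neither is.
// `!` asserts the index result isn't null.
let customer = Customer.byEmail("alice.appleseed@example.com").first()!

let order = Order.create({
  customer: customer,
  status: "processing",
  createdAt: Time.now(),
  payment: {}
})

OrderItem.create({
  order: order,
  product: Product.byId("111"),
  quantity: 2
})
```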
### [](#change-data-capture-cdc-and-real-time-events)Change data capture (CDC) and real-time events

Fauna’s [Change data capture (CDC)](../../learn/cdc/) feature enables real-time application features. Developers can subscribe to changes in collections or specific documents, receiving atomic push updates in real time through [event streams](../../learn/cdc/) or pulling batched updates on demand through [event feeds](../../learn/cdc/). Events are particularly useful for maintaining live application states, building collaborative features that require real-time synchronization, and mirroring changes into external systems.

### [](#database-model)Database model

Fauna’s [database model](../../learn/data-model/databases/) makes it easy to create databases for isolated environments, such as staging and production, and to build secure multi-tenant applications. For multi-tenant applications, developers can create an isolated child database for each tenant, applying separate access controls and schema as needed. The same approach works for environment isolation: a separate database per environment ensures that changes in one environment don’t affect the others. This model simplifies administration, ensures clear separation between environments, and guarantees that tenants or environments cannot interfere with each other’s data.

## [](#developer-experience)Developer experience

Fauna is designed to integrate seamlessly into modern development workflows. It provides official [client drivers](../../build/drivers/) for popular programming languages such as JavaScript, Python, Go, Java, and C#. The [Fauna CLI](../../build/cli/v4/) offers tools for schema management and database operations, while the Dashboard enables visual database management and query execution. Additionally, the [Fauna container image](../../build/tools/docker/) can be used for local testing and validation, even in environments where connectivity is limited.

By supporting schema-as-code practices and CI/CD integration, Fauna allows development teams to treat database changes with the same rigor as application code changes. This approach promotes better collaboration, easier testing, and more reliable deployments.

## [](#comparison-to-other-databases)Comparison to other databases

While Fauna shares some characteristics with both document and relational databases, it offers unique capabilities that set it apart.

### [](#document-databases-example-mongodb)Document databases (Example: MongoDB)

Fauna and MongoDB are both document databases, but Fauna offers a unique combination of document flexibility with relational database features. While MongoDB provides a flexible document model and horizontal scalability, Fauna enhances this with strong consistency guarantees, built-in support for complex relationships, and more powerful querying capabilities through FQL.

Fauna’s document model allows for enforcing schema when needed, supporting complex joins, and maintaining ACID compliance across distributed environments. Unlike MongoDB, Fauna provides native multi-region distribution and doesn’t require separate replication setup or sharding configurations.

See Fauna for MongoDB users for a detailed comparison.

### [](#key-value-stores-example-dynamodb)Key-value stores (Example: DynamoDB)

Compared to key-value stores like DynamoDB, Fauna offers a richer data model and query capabilities while maintaining high performance and scalability.
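To make the comparison concrete, here is a sketch in FQL, using the e-commerce `Product` collection and `sortedByPriceLowToHigh()` index that appear in the quick starts and examples later in this guide. A single request filters, sorts, and resolves a related document, work that would typically be spread across multiple requests and application-side code against a key-value store:

```fql
// One request: index-backed ordering, a range filter on price,
// and a projection that resolves the related `Category` document.
Product.sortedByPriceLowToHigh()
  .where(.price <= 1000) {
    name,
    price,
    category {
      name
    }
  }
```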
Fauna’s document model allows for more complex data structures and relationships, reducing the need for the denormalization often required in DynamoDB. Fauna provides native support for secondary indexes without the limitations found in DynamoDB. For example, Fauna can index nested fields and arrays. Additionally, Fauna’s query language (FQL) allows for more sophisticated queries and data manipulations within the database, potentially reducing application-side complexity. Both databases offer serverless operations, but Fauna’s multi-region distribution is built in, simplifying global deployment.

### [](#traditional-relational-databases-example-postgresql)Traditional relational databases (Example: PostgreSQL)

While both Fauna and PostgreSQL are relational databases supporting ACID transactions, their architectures and deployment models differ significantly. PostgreSQL follows a traditional client-server model with a fixed schema, while Fauna offers a flexible document model with optional schema enforcement. Fauna’s serverless, globally distributed architecture contrasts with PostgreSQL’s typical single-region deployment, offering easier scalability and global distribution out of the box.

Fauna’s FQL provides a modern, programmable query interface, whereas PostgreSQL uses standard SQL. Both support stored procedures, but Fauna’s user-defined functions (UDFs) are written in FQL and executed within the database’s distributed environment. Fauna also offers built-in temporality and multi-tenancy features, which typically require additional setup in PostgreSQL.

See Fauna for SQL users for a detailed comparison.

### [](#newsql-databases-example-cockroachdb)NewSQL databases (Example: CockroachDB)

Fauna and NewSQL databases like CockroachDB both aim to combine the scalability of NoSQL systems with the strong consistency of traditional relational databases. However, Fauna’s approach differs in its serverless, API-first design. While CockroachDB focuses on providing a distributed SQL database, Fauna offers a relational document model with its own query language (FQL).

Fauna’s built-in multi-tenancy and attribute-based access control provide more granular security options out of the box. Both databases offer global distribution, but Fauna’s serverless nature means users don’t need to manage nodes or worry about data placement strategies. Fauna’s [temporal query](../../learn/doc-history/#temporal-query) capabilities and [event feeds and event streams](../../learn/cdc/) are also notable features not typically found in NewSQL offerings.

### [](#summary)Summary

With this combined set of features, Fauna aims to provide a unified solution that meets the diverse needs of modern application development, from rapid prototyping to global-scale production deployments. Its unique approach allows developers to build sophisticated, globally distributed applications without the operational complexity traditionally associated with such systems.
# Quick starts Get Fauna up and running quickly using your preferred programming language or tool: ![Node.js](../../build/_images/integration/logos/nodejs.svg) [Node.js](nodejs/) ![Python](../../build/_images/drivers/logos/python.svg) [Python](python/) ![Go](../../build/_images/drivers/logos/golang.svg) [Go](go/) ![C#](../../build/_images/drivers/logos/csharp.svg) [.NET/C#](dotnet/) ![Java](../../build/_images/drivers/logos/java.svg) [Java](java/) ![Ruby](../_images/logos/ruby.svg) [Ruby](ruby/) ![Rust](../_images/logos/rust.svg) [Rust](rust/) ![Swift](../_images/logos/swift.svg) [Swift](swift/) ![HTTP API](../../build/_images/integration/logos/http-api.svg) [HTTP API](http-api/) ![Script tag](../../build/_images/integration/logos/html5.svg) [Script tag](script-tag/) ![Cloudflare](../../build/_images/integration/logos/cloudflare.svg) [Cloudflare Workers](cloudflare/) ![AWS serverless services](../../build/_images/integration/logos/aws.png) [AWS Lambda](aws-lambda/) # Node.js quick start Use Fauna’s [JavaScript driver](../../../build/drivers/js-client/) to query e-commerce demo data in a Node.js app. The app requires Node.js v18 or later and npm. 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. Fauna’s client drivers can access the secret from this variable. ```bash export FAUNA_SECRET= ``` 4. **Install the JavaScript driver** Create a new directory for your app and install the [JavaScript driver](../../../build/drivers/js-client/): ```bash mkdir app cd app npm install fauna ``` 5. **Create a basic app** In the `app` directory, create an `app.mjs` file and add the following code: ```javascript import { Client, fql, FaunaError } from "fauna"; // Use `require` for CommonJS: // const { Client, fql, FaunaError } = require('fauna'); // Initialize the client to connect to Fauna // using the `FAUNA_SECRET` environment variable. const client = new Client(); try { // Compose a query using an FQL template string. // The query calls the `Product` collection's // `sortedByPriceLowToHigh()` index. It projects the `name`, // `description`, and `price` fields covered by the index. const query = fql` Product.sortedByPriceLowToHigh() { name, description, price }`; // Run the query. const pages = client.paginate(query); // Iterate through the results. const products = []; for await (const product of pages.flatten()) { products.push(product); } console.log(products); } catch (error) { if (error instanceof FaunaError) { console.log(error); } } finally { client.close(); } ``` 6. **Run the app** Run the script from the `app` directory. The script prints a list of e-commerce products from the demo data in the terminal. ```bash node app.mjs ``` # Python quick start Use Fauna’s [Python driver](../../../build/drivers/py-client/) to query e-commerce demo data in a Python app. The driver requires Python 3.9 or later. 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. Fauna’s client drivers can access the secret from this variable. ```bash export FAUNA_SECRET= ``` 4. **Install the Python driver** Create a new directory for your app and install the [Python driver](../../../build/drivers/py-client/): ```bash mkdir app cd app pip install fauna ``` 5. 
**Create a basic app** In the `app` directory, create an `app.py` file and add the following code: ```python from fauna import fql from fauna.client import Client from fauna.encoding import QuerySuccess from fauna.errors import FaunaException # Initialize the client to connect to Fauna # using the `FAUNA_SECRET` environment variable. client = Client() try: # Compose a query using an FQL template string. # The query calls the `Product` collection's # `sortedByPriceLowToHigh()` index. It projects the `name`, # `description`, and `price` fields covered by the index. query = fql( """ Product.sortedByPriceLowToHigh() { name, description, price }""" ) # Run the query. pages = client.paginate(query) # Iterate through the results. for products in pages: for product in products: print(product) except FaunaException as e: print(e) finally: client.close() ``` 6. **Run the app** Run the script from the `app` directory. The script prints a list of e-commerce products from the demo data in the terminal. ```bash python app.py ``` # Go quick start Use Fauna’s [Go driver](../../../build/drivers/go-client/) to query e-commerce demo data in a Go app. The driver requires Go 1.19 or later. 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. Fauna’s client drivers can access the secret from this variable. ```bash export FAUNA_SECRET= ``` 4. **Install the Go driver** Create a new directory for your app and install the [Go driver](../../../build/drivers/go-client/): ```bash mkdir app cd app go mod init app go get github.com/fauna/fauna-go/v3 ``` 5. **Create a basic app** In the `app` directory, create an `app.go` file and add the following code: ```go package main import ( "fmt" "github.com/fauna/fauna-go/v3" ) func main() { // Initialize the client to connect to Fauna // using the `FAUNA_SECRET` environment variable. client, clientErr := fauna.NewDefaultClient() if clientErr != nil { panic(clientErr) } // Compose a query using an FQL template string. // The query calls the `Product` collection's // `sortedByPriceLowToHigh()` index. It projects the `name`, // `description`, and `price` fields covered by the index. query, _ := fauna.FQL(` Product.sortedByPriceLowToHigh() { name, description, price } `, nil) // Run the query. paginator := client.Paginate(query) // Iterate through the results. for { page, _ := paginator.Next() var pageItems []any page.Unmarshal(&pageItems) for _, item := range pageItems { jsonData, _ := json.Marshal(item) fmt.Println(string(jsonData)) } if !paginator.HasNext() { break } } } ``` 6. **Run the app** Run the script from the `app` directory. The script prints a list of e-commerce products from the demo data in the terminal. ```bash go run app.go ``` # .NET/C# quick start Use Fauna’s [.NET/C# driver](../../../build/drivers/dotnet-client/) to query e-commerce demo data in a .NET app. The driver requires .NET 8.0 or later. 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. Fauna’s client drivers can access the secret from this variable. ```bash export FAUNA_SECRET= ``` 4. **Install the .NET/C# driver** Create a new directory for your app and install the [.NET/C# driver](../../../build/drivers/dotnet-client/): ```bash mkdir app cd app dotnet new console dotnet add package Fauna ``` 5. 
**Create a basic app** In the `app` directory, edit the `Program.cs` file and replace the code with the following: ```csharp using Fauna; using Fauna.Exceptions; using static Fauna.Query; try { // Initialize the client to connect to Fauna // using the `FAUNA_SECRET` environment variable. var client = new Client(); // Compose a query using an FQL template string. // The query calls the `Product` collection's // `sortedByPriceLowToHigh()` index. It projects the `name`, // `description`, and `price` fields covered by the index. var query = FQL($@" Product.sortedByPriceLowToHigh() {{ name, description, price }} "); // Run the query. var response = client.PaginateAsync>(query); await foreach (var page in response) { foreach (var product in page.Data) { Console.WriteLine(product["name"]); Console.WriteLine(product["description"]); Console.WriteLine(product["price"]); Console.WriteLine("--------"); } } } catch (FaunaException e) { Console.WriteLine(e); } ``` 6. **Run the app** Run the script from the `app` directory. The script prints a list of e-commerce products from the demo data in the terminal. ```bash dotnet run ``` # Java quick start Use Fauna’s [JVM driver](../../../build/drivers/jvm-client/) to query e-commerce demo data in a Java 17 app. You’ll set up the app using [Gradle](https://gradle.org/install/). 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. Fauna’s client drivers can access the secret from this variable. ```bash export FAUNA_SECRET= ``` 4. **Create a Gradle project** Create a new directory for your app and initialize a Gradle project: ```bash mkdir example cd example gradle init \ --type java-application \ --dsl groovy \ --test-framework junit \ --project-name app \ --package app \ --java-version 17 \ --no-incubating \ --no-split-project ``` 5. **Add the JVM driver as a dependency** In the `example/app` directory, edit `example/app/build.gradle` and add the [JVM driver](../../../build/drivers/jvm-client/) as a dependency: ```groovy dependencies { ... implementation "com.fauna:fauna-jvm:1.0.0" ... } ``` 6. **Create a basic app** In the `example/app` directory, edit the `example/app/src/main/java/app/App.java` file and replace the code with the following: ```java package app; import java.util.concurrent.CompletableFuture; import java.util.concurrent.ExecutionException; import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.exception.FaunaException; import com.fauna.query.builder.Query; import com.fauna.response.QuerySuccess; import com.fauna.types.Page; import static com.fauna.codec.Generic.pageOf; import static com.fauna.query.builder.Query.fql; public class App { // Define class for `Product` documents // in expected results. public static class Product { public String name; public String description; public Integer price; } public static void main(String[] args) { try { // Initialize the client to connect to Fauna // using the `FAUNA_SECRET` environment variable. FaunaClient client = Fauna.client(); // Compose a query using an FQL template string. // The query calls the `Product` collection's // `sortedByPriceLowToHigh()` index. It projects the `name`, // `description`, and `price` fields covered by the index. Query query = fql(""" Product.sortedByPriceLowToHigh() { name, description, price } """); // Run the query. 
System.out.println("Running synchronous query:"); runSynchronousQuery(client, query); } catch (FaunaException e) { System.err.println("Fauna error occurred: " + e.getMessage()); e.printStackTrace(); } } private static void runSynchronousQuery(FaunaClient client, Query query) throws FaunaException { // Use `query()` to run a synchronous query. // Synchronous queries block the current thread until the query completes. // Accepts the query, expected result class, and a nullable set of query options. QuerySuccess> result = client.query(query, pageOf(Product.class)); printResults(result.getData()); } // Iterate through the products in the page. private static void printResults(Page page) { for (Product product : page.getData()) { System.out.println("Name: " + product.name); System.out.println("Description: " + product.description); System.out.println("Price: " + product.price); System.out.println("--------"); } // Print the `after` cursor to paginate through results. System.out.println("After: " + page.getAfter()); } } ``` 7. **Run the app** Run the script from the `example` directory. The script prints a list of e-commerce products from the demo data in the terminal. ```bash ./gradlew run ``` # Ruby quick start Use the [Fauna Core HTTP API](../../../reference/http/reference/core-api/) to query e-commerce demo data using Ruby’s [Net::HTTP](https://github.com/ruby/net-http) client library. 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. ```bash export FAUNA_SECRET= ``` 4. **Set up an app** Create and navigate to a new directory for your app; ```bash mkdir app cd app ``` 5. **Create a basic app** In the `app` directory, create an `app.rb` file and add the following code: ```ruby require 'net/http' require 'uri' require 'json' # Define the Query API endpoint and headers. uri = URI("https://db.fauna.com/query/1") headers = { "Authorization" => "Bearer #{ENV['FAUNA_SECRET']}", "Content-Type" => "application/json", "X-Format" => "simple" } # Define the FQL query. query_payload = { query: "Product.sortedByPriceLowToHigh() { name, description, price }" } # Make the HTTP request. http = Net::HTTP.new(uri.host, uri.port) http.use_ssl = true request = Net::HTTP::Post.new(uri.path, headers) request.body = query_payload.to_json response = http.request(request) # Output the response. puts JSON.pretty_generate(JSON.parse(response.body)) ``` 6. **Run the app** Run the app from the `app` directory. ```bash ruby app.rb ``` The app prints the Query API’s response. The `data` property contains the results of the query: ``` { "data": { "data": [ { "name": "single lime", "description": "Conventional, 1 ct", "price": 35 }, { "name": "cilantro", "description": "Organic, 1 bunch", "price": 149 }, { "name": "limes", "description": "Conventional, 16 oz bag", "price": 299 }, { "name": "organic limes", "description": "Organic, 16 oz bag", "price": 349 }, { "name": "avocados", "description": "Conventional Hass, 4ct bag", "price": 399 }, { "name": "pizza", "description": "Frozen Cheese", "price": 499 }, { "name": "cups", "description": "Translucent 9 Oz, 100 ct", "price": 698 }, { "name": "taco pinata", "description": "Giant Taco Pinata", "price": 2399 }, { "name": "donkey pinata", "description": "Original Classic Donkey Pinata", "price": 2499 } ] }, ... 
}
```

# Rust quick start

Use the [Fauna Core HTTP API](../../../reference/http/reference/core-api/) to query e-commerce demo data using Rust’s [reqwest](https://docs.rs/reqwest/latest/reqwest/) HTTP client library.

1. **Create a database with demo data**

2. **Create an authentication secret**

3. **Set the FAUNA\_SECRET environment variable**

   Set the `FAUNA_SECRET` environment variable to your key’s secret.

   ```bash
   export FAUNA_SECRET=
   ```

4. **Set up a Rust project**

   Create and navigate to a new directory for your app. Then initialize a Rust project:

   ```bash
   mkdir app
   cd app
   cargo init
   ```

   In the `app` directory, add the following dependencies to `Cargo.toml`:

   ```toml
   # Cargo.toml
   ...
   [dependencies]
   # HTTP client
   reqwest = { version = "0.11", features = ["json"] }
   # Lets you run async tasks
   tokio = { version = "1", features = ["full"] }
   # Serializer/deserializer for JSON
   serde = { version = "1.0", features = ["derive"] }
   serde_json = "1.0"
   ```

5. **Create a basic app**

   In the `app` directory, replace the content of `src/main.rs` with the following code:

   ```rust
   use reqwest::{Client, header};
   use std::env;
   use serde_json::json;

   #[tokio::main]
   async fn main() {
       let fauna_secret = env::var("FAUNA_SECRET")
           .expect("FAUNA_SECRET environment variable is not set");

       // Define the Query API endpoint.
       let url = "https://db.fauna.com/query/1";
       let client = Client::new();

       // Define the headers.
       let mut headers = header::HeaderMap::new();
       headers.insert("Authorization", format!("Bearer {}", fauna_secret).parse().unwrap());
       headers.insert("Content-Type", "application/json".parse().unwrap());
       headers.insert("X-Format", "simple".parse().unwrap());

       // Define the FQL query.
       let query_payload = json!({
           "query": "Product.sortedByPriceLowToHigh() { name, description, price }"
       });

       // Make the HTTP request.
       let response = client.post(url)
           .headers(headers)
           .json(&query_payload)
           .send()
           .await
           .expect("Failed to send the request");

       // Output the response.
       match response.json::<serde_json::Value>().await {
           Ok(json) => println!("{}", serde_json::to_string_pretty(&json).unwrap()),
           Err(err) => eprintln!("Failed to parse response: {}", err),
       }
   }
   ```

6. **Run the app**

   Run the app from the `app` directory.

   ```bash
   cargo run
   ```

   The app prints the Query API’s response. The `data` property contains the results of the query:

   ```
   {
     "data": {
       "data": [
         { "description": "Conventional, 1 ct", "name": "single lime", "price": 35 },
         { "description": "Organic, 1 bunch", "name": "cilantro", "price": 149 },
         { "description": "Conventional, 16 oz bag", "name": "limes", "price": 299 },
         { "description": "Organic, 16 oz bag", "name": "organic limes", "price": 349 },
         { "description": "Conventional Hass, 4ct bag", "name": "avocados", "price": 399 },
         { "description": "Frozen Cheese", "name": "pizza", "price": 499 },
         { "description": "Translucent 9 Oz, 100 ct", "name": "cups", "price": 698 },
         { "description": "Giant Taco Pinata", "name": "taco pinata", "price": 2399 },
         { "description": "Original Classic Donkey Pinata", "name": "donkey pinata", "price": 2499 }
       ]
     },
     ...
   }
   ```

# Swift quick start

Use the [Fauna Core HTTP API](../../../reference/http/reference/core-api/) to query e-commerce demo data using Swift.

1. **Create a database with demo data**

2. **Create an authentication secret**

3. **Set the FAUNA\_SECRET environment variable**

   Set the `FAUNA_SECRET` environment variable to your key’s secret.

   ```bash
   export FAUNA_SECRET=
   ```

4. **Set up a Swift project**

   Create and navigate to a new directory for your app.
Then initialize a Swift project: ```bash mkdir app cd app swift package init --type executable ``` 5. **Create a basic app** In the `app` directory, replace the content of `Sources/app/main.swift` with the following code: ```swift import Foundation // Define the Query API endpoint and headers. let url = URL(string: "https://db.fauna.com/query/1")! let faunaSecret = ProcessInfo.processInfo.environment["FAUNA_SECRET"]! var request = URLRequest(url: url) request.httpMethod = "POST" request.addValue("Bearer \(faunaSecret)", forHTTPHeaderField: "Authorization") request.addValue("application/json", forHTTPHeaderField: "Content-Type") request.addValue("simple", forHTTPHeaderField: "X-Format") // Define the FQL query. let queryPayload: [String: Any] = [ "query": "Product.sortedByPriceLowToHigh() { name, description, price }" ] do { // Set the request body with query. let jsonData = try JSONSerialization.data(withJSONObject: queryPayload, options: []) request.httpBody = jsonData } catch { print("Failed to encode the JSON body: \(error)") exit(1) } // Make the HTTP request. let task = URLSession.shared.dataTask(with: request) { data, response, error in if let error = error { print("Error: \(error)") return } guard let data = data else { print("No data received") return } do { // Output the response. let json = try JSONSerialization.jsonObject(with: data, options: []) as! [String: Any] let prettyJsonData = try JSONSerialization.data(withJSONObject: json, options: .prettyPrinted) if let prettyJson = String(data: prettyJsonData, encoding: .utf8) { print(prettyJson) } } catch { print("Failed to parse response: \(error)") } } // Start the task. task.resume() // Keep the program running to wait for the async response. RunLoop.main.run() ``` 6. **Run the app** Run the app from the `app` directory. ```bash swift run ``` The app prints the Query API’s response. The `data` property contains the results of the query: ``` { "summary" : "", "data" : { "data" : [ { "name" : "single lime", "price" : 35, "description" : "Conventional, 1 ct" }, { "name" : "cilantro", "price" : 149, "description" : "Organic, 1 bunch" }, { "name" : "limes", "price" : 299, "description" : "Conventional, 16 oz bag" }, { "name" : "organic limes", "price" : 349, "description" : "Organic, 16 oz bag" }, { "name" : "avocados", "price" : 399, "description" : "Conventional Hass, 4ct bag" }, { "name" : "pizza", "price" : 499, "description" : "Frozen Cheese" }, { "name" : "cups", "price" : 698, "description" : "Translucent 9 Oz, 100 ct" }, { "name" : "taco pinata", "price" : 2399, "description" : "Giant Taco Pinata" }, { "name" : "donkey pinata", "price" : 2499, "description" : "Original Classic Donkey Pinata" } ] }, ... } ``` # HTTP API quick start Use the [Fauna Core HTTP API](../../../reference/http/reference/core-api/) to query e-commerce demo data using curl or any client that can make HTTP requests. 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Set the FAUNA\_SECRET environment variable** Set the `FAUNA_SECRET` environment variable to your key’s secret. ```bash export FAUNA_SECRET= ``` 4. **Make an HTTP request** To run a query, use curl to make a request to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). Fauna routes and authorizes the request using the secret in `FAUNA_SECRET` environment variable. The secret is scoped to a specific database. The query only runs in this database. 
```bash
curl -X POST \
  "https://db.fauna.com/query/1" \
  -H "Authorization: Bearer $FAUNA_SECRET" \
  -H "Content-Type: application/json" \
  -H "X-Format: simple" \
  -d '{
    "query": "Product.sortedByPriceLowToHigh() { name, description, price }"
  }'
```

The response’s `data` property contains the results of the query:

```
{
  "data": {
    "data": [
      { "name": "single lime", "description": "Conventional, 1 ct", "price": 35 },
      { "name": "cilantro", "description": "Organic, 1 bunch", "price": 149 },
      { "name": "limes", "description": "Conventional, 16 oz bag", "price": 299 },
      { "name": "organic limes", "description": "Organic, 16 oz bag", "price": 349 },
      { "name": "avocados", "description": "Conventional Hass, 4ct bag", "price": 399 },
      { "name": "pizza", "description": "Frozen Cheese", "price": 499 },
      { "name": "cups", "description": "Translucent 9 Oz, 100 ct", "price": 698 },
      { "name": "taco pinata", "description": "Giant Taco Pinata", "price": 2399 },
      { "name": "donkey pinata", "description": "Original Classic Donkey Pinata", "price": 2499 }
    ]
  },
  ...
}
```

# Script tag quick start

Use Fauna’s [JavaScript driver](../../../build/drivers/js-client/) to query e-commerce demo data in client-side HTML.

1. **Create a database with demo data**

2. **Create an authentication secret**

3. **Create a webpage**

   Create an `index.html` file and add the following code. Replace `KEY_SECRET` with your copied key secret.

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Fauna Product List</title>
    <!-- Load the Fauna JavaScript driver's browser bundle.
         The bundle path and the `fauna` global shown here are assumptions;
         adjust them to match the driver build you load. -->
    <script src="https://unpkg.com/fauna@latest/dist/browser/index.js"></script>
  </head>
  <body>
    <h1>Product List</h1>
    <table>
      <thead>
        <tr>
          <th>Name</th>
          <th>Description</th>
          <th>Price</th>
        </tr>
      </thead>
      <tbody id="products"></tbody>
    </table>
    <script>
      const { Client, fql, FaunaError } = fauna;

      // Initialize the client. Replace KEY_SECRET with your key's secret.
      const client = new Client({ secret: "KEY_SECRET" });

      async function loadProducts() {
        try {
          // Query the `Product` collection's `sortedByPriceLowToHigh()` index.
          const query = fql`
            Product.sortedByPriceLowToHigh() {
              name,
              description,
              price
            }`;

          const pages = client.paginate(query);
          const tbody = document.getElementById("products");

          // Add a table row for each product.
          for await (const product of pages.flatten()) {
            const row = document.createElement("tr");
            row.innerHTML = `
              <td>${product.name}</td>
              <td>${product.description}</td>
              <td>${product.price}</td>`;
            tbody.appendChild(row);
          }
        } catch (error) {
          if (error instanceof FaunaError) {
            console.log(error);
          }
        }
      }

      loadProducts();
    </script>
  </body>
</html>
``` 4. **Open the webpage** Open the `index.html` file in a web browser. The page displays a table of e-commerce products from the demo data. ![HTML page example](../../_images/script-tag-page-ex.png) # Cloudflare Workers quick start Use Fauna’s [JavaScript driver](../../../build/drivers/js-client/) to query e-commerce demo data using a [Cloudflare Worker](https://workers.cloudflare.com/). The Worker uses TypeScript. You’ll set up the Worker using the [Cloudflare Wrangler CLI](https://developers.cloudflare.com/workers/wrangler/install-and-update/) and the [Fauna Cloudflare integration](https://developers.cloudflare.com/workers/databases/native-integrations/fauna/). 1. **Create a database with demo data** 2. **Create and deploy a Cloudflare Worker** In your terminal: * Install the Wrangler CLI. * Create a Cloudflare Worker named `my-fauna-worker`. * Install the Fauna [JavaScript driver](../../../build/drivers/js-client/). * Deploy the Cloudflare Worker. ```bash npm install -g wrangler@latest npm create cloudflare my-fauna-worker -- \ --category hello-world \ --type hello-world \ --lang ts \ --git false \ --deploy true cd my-fauna-worker npm install fauna ``` 3. **Set up the Fauna integration** 1. In your browser, log in to the [Cloudflare dashboard](https://dash.cloudflare.com/) and select your account. 2. In **Account Home**, select **Workers & Pages**. 3. In the **Workers** tab, select the **my-fauna-worker** Worker. 4. Select **Integrations > Fauna**. 5. If prompted, log in to Fauna and authorize the integration. 6. Follow the integration setup flow and select: * Your Fauna database * The **server-readonly** database role * `FAUNA_SECRET` as the **Secret Name**. 7. Click **Finish**. ![Fauna Cloudflare integration](../../_images/cloud-flare-integration-ex.png) 4. **Edit the Worker’s code** In your local `my-fauna-worker` directory, edit the `src/index.ts` file and replace the code with the following: ```typescript import { Client, fql, FaunaError, ServiceError } from "fauna"; // This interface defines the structure of the environment // variables that get passed in. export interface Env { FAUNA_SECRET: string; } export default { async fetch(request, env, ctx): Promise { // 1. Initialize a Fauna client using the FAUNA_SECRET env var. const client = new Client({ secret: env.FAUNA_SECRET }); // Extract the email from the query string. Otherwise, use an // email address that's in the Fauna sample data. const url = new URL(request.url); const email = url.searchParams.get("email") || "alice.appleseed@example.com"; try { // 2. Execute an FQL query to retrieve data from Fauna. // In this example, it queries 'Customer.byEmail(...)' for // a particular user. 'first()' ensures only the first // matching result is returned as email has a unique // constraint on it in this collection. const getData = await client.query( fql`Customer.byEmail(${email}).first()` ); // 3. Return the retrieved data as a JSON response. return new Response( JSON.stringify(getData), { status: 200 } ); } catch (error) { // 4. Handle Fauna-specific errors separately from // other errors. if (error instanceof FaunaError) { // If it's a service error (e.g., a problem with the // query or the Fauna service itself), log the Fauna // queryInfo summary to the console for debugging. if (error instanceof ServiceError) { console.error(error.queryInfo?.summary); } else { // Otherwise, return a generic error response for // Fauna errors. return new Response( "Error " + error, { status: 500 } ); } } // 5. 
For any other error, return a less specific // message to prevent leaking internal error // details to the caller. return new Response( "An error occurred, " + error.message, { status: 500, }); } }, } satisfies ExportedHandler; ``` 5. **Re-deploy and test the Worker** In your local `my-fauna-worker` directory, use the Wrangler CLI to re-deploy the app. Then use `curl` to make a `GET` HTTP request to the URL from the deployment output. The Worker returns a query response containing sample e-commerce customer from the Fauna demo data. ```bash curl -X GET $(wrangler deploy | grep -o 'https://[^ ]*') | jq . ``` ``` { "data": { "coll": { "name": "Customer" }, "id": "111", "ts": { "isoString": "2099-01-08T19:30:22.700Z" }, "cart": { "coll": { "name": "Order" }, "id": "419541310615060553" }, "orders": { "after": "hdW..." }, "name": "Alice Appleseed", "email": "alice.appleseed@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } }, ... } ``` # AWS Lambda quick start Use Fauna’s [JavaScript driver](../../../build/drivers/js-client/) to query e-commerce demo data using an [AWS Lambda function](https://aws.amazon.com/pm/lambda/). 1. **Create a database with demo data** 2. **Create an authentication secret** 3. **Create the Lambda function** 1. Open the [Functions page](https://console.aws.amazon.com/lambda/home#/functions) of the Lambda console. 2. Click **Create function**. 3. Select **Author from scratch**. 4. In the **Basic information** pane, for **Function name**, enter `myFaunaLambda`. 5. For **Runtime**, click **Node.js 22.x**. 6. Leave **architecture** set to **x86\_64**. 7. Under **Additional Configurations**, select **Enable function URL**. Under **Auth type**, select **NONE**. 8. Click **Create function**. ![Create Lambda function](../../_images/lambda-create-fn.gif) Save the resulting **Function URL**. You’ll use the URL later. ![Lambda function URL](../../_images/lambda-fn-url.png) 4. **Set the FAUNA\_SECRET environment variable** 1. On the function’s page, click the **Configuration** tab, then click **Environment variables**. 2. Under **Environment variables**, click **Edit**. 3. Click **Add environment variable**. 4. Enter `FAUNA_SECRET` as the **Key**. Paste the key secret you copied earlier as the **Value**. 5. Click **Save**. ![Set FAUNA_SECRET environment variable](../../_images/set-env-var.gif) 5. **Initialize a Node.js project** In your local machine’s terminal: * Create and navigate to a project directory named `myFaunaLambda`. * Initialize a Node.js project. * Install the Fauna [JavaScript driver](../../../build/drivers/js-client/). ```bash mkdir myFaunaLambda cd myFaunaLambda npm init -y npm install fauna ``` 6. **Add code for the Lambda function** In the `myFaunaLambda` directory, create an `index.mjs` file and add the following code: ```javascript import { Client, fql, FaunaError } from "fauna"; export const handler = async (event) => { // Initialize the client to connect to Fauna // using the `FAUNA_SECRET` environment variable. const client = new Client(); try { // Compose a query using an FQL template string. // The query calls the `Product` collection's // `sortedByPriceLowToHigh()` index. It projects the `name`, // `description`, and `price` fields covered by the index. const query = fql` Product.sortedByPriceLowToHigh() { name, description, price }`; // Run the query. const pages = await client.paginate(query); // Iterate through the results. 
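    // `paginate()` returns an iterator of result pages;
    // `flatten()` yields the individual documents across those pages.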
const products = []; for await (const product of pages.flatten()) { products.push(product); } return { statusCode: 200, body: JSON.stringify(products) }; } catch (error) { console.error('Fauna Query Error:', error); return { statusCode: 500, body: JSON.stringify({ error: error instanceof FaunaError ? error.message : 'Unknown error' }) }; } finally { client.close(); } }; ``` 7. **Package the function** In the `myFaunaLambda` directory, run the following command to package the function as a ZIP file: ```bash zip -r function.zip . ``` 8. **Upload the function** 1. On the Lambda function page, click the **Code** tab, then click **Upload from** and select **.zip file**. 2. Click **Upload**. 3. Select the **function.zip** file in the **myFaunaLambda** directory. 4. Click **Save**. ![Upload Lambda function](../../_images/upload-lambda-fn.gif) 9. **Test the function** To test the function, make a GET request to the Function URL you saved earlier. Using curl: ```bash curl | jq . ``` The response contains the results of the query: ``` [ { "name": "single lime", "description": "Conventional, 1 ct", "price": 35 }, { "name": "cilantro", "description": "Organic, 1 bunch", "price": 149 }, { "name": "limes", "description": "Conventional, 16 oz bag", "price": 299 }, { "name": "organic limes", "description": "Organic, 16 oz bag", "price": 349 }, { "name": "avocados", "description": "Conventional Hass, 4ct bag", "price": 399 }, { "name": "pizza", "description": "Frozen Cheese", "price": 499 }, { "name": "cups", "description": "Translucent 9 Oz, 100 ct", "price": 698 }, { "name": "taco pinata", "description": "Giant Taco Pinata", "price": 2399 }, { "name": "donkey pinata", "description": "Original Classic Donkey Pinata", "price": 2499 } ] ``` # Fauna for MongoDB users This guide outlines major differences between MongoDB and Fauna. The guide also translates common MongoDB Query Language (MQL) queries to Fauna Query Language (FQL) and Fauna Schema Language (FSL). ## [](#major-differences)Major differences The following table outlines major differences between MongoDB and Fauna. | Difference | MongoDB | Fauna | | --- | --- | --- | --- | --- | | Data model | Document database. Stores data as denormalized JSON-like documents in collections. | Combines document flexibility with native relational capabilities. Stores data as denormalized and normalized JSON-like documents in collections with relationship traversals. | | Access control | Role-based access control (RBAC). | RBAC and attribute-based access control (ABAC). Roles and privileges can be assigned dynamically based on conditions at query time. | | Administration | Uses managed clusters that require configuration and sizing for best performance and costs. | Fully managed infrastructure that scales automatically. No configuration or sizing needed. | | Connection methods | Client drivers. | Client drivers and HTTP API. | | Consistency | Strong consistency requires additional configuration and may impact performance. Distributed, consistent transactions across shards require specific configuration. | Strong consistency by default across multiple regions, continents, and clouds. No configuration needed. No performance impacts. | | Distribution | Single region by default. Multi-region are configurations available at additional costs. | Natively multi-region. Multi-region setups require no additional costs or maintenance.Supports multi-cloud setups with Virtual Private Fauna. 
| | Indexes | Supports single, compound, geospatial, and text search indexes.Managing indexes across multiple shards may require rolling index builds and changes to application logic. | Supports single field and compound indexes.Indexes with terms are automatically sharded (partitioned) with no operational overhead. Sharding is transparent to the client applications. | | Multi-tenancy | No native concept of multi-tenancy. Multi-tenancy is handled by the client application. | Natively multi-tenant. Uses a nested database-per-tenant model that can scale to hundreds of thousands of tenants.Tenant databases are instantly allocated and logically isolated from peers. | | Schema | Schemaless. Offers schema validation for document writes, but no native support for migrating existing documents. | Supports a progressive schema enforcement, letting you migrate from schemaless to strict enforcement of document types.Zero-downtime migrations let you update existing documents to the latest schema. Supports constraints for custom data validation rules. | | Sharding | Unsharded by default. Clusters can be configured as sharded, which results in reduced capabilities. | Natively sharded with no configuration needed. Sharding is transparent to client applications. | | Change data capture (CDC) and real-time events | Offers Change Streams to track real-time data changes.You can subscribe to changes on a single collection, a database, or across a cluster. | Offers event streams for real-time data changes and event feeds for asynchronous event pulls.You can track changes at the collection, index, document, or specific field level. | ## [](#examples)Examples The following examples compare basic operations in MongoDB and Fauna. MongoDB examples use MQL. Fauna examples use FQL and FSL. ### [](#create-and-manage-collections)Create and manage collections #### [](#create-a-schemaless-collection)Create a schemaless collection In a schemaless collection, documents can contain any field of any type. Documents in the same collection aren’t required to have the same fields. ```mql // Creates the `Product` collection. db.createCollection("Product") ``` To create a collection, create a collection schema in the database. An example FSL collection schema: ```fsl // Defines the `Product` collection. collection Product { // Wildcard constraint. // Allows arbitrary ad hoc fields of any type. *: Any // If a collection schema has no field definitions // and no wildcard constraint, it has an implicit // wildcard constraint of `*: Any`. } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../learn/schema/manage-schema/#fql) #### [](#create-a-collection-with-predefined-fields)Create a collection with predefined fields Predefined fields let you control what fields are accepted in a collection’s documents. In MongoDB, you can control the structure of a collection’s documents using a schema validator: ```mql // Creates the `Customer` collection with a schema validator. 
db.createCollection("Customer", { validator: { $jsonSchema: { bsonType: "object", required: ["email", "name", "address"], properties: { email: { bsonType: "string" }, name: { bsonType: "string" }, status: { bsonType: "string", enum: ["silver", "gold", "platinum"] }, address: { bsonType: "object", required: ["street", "city", "state", "postalCode", "country"], properties: { street: { bsonType: "string" }, city: { bsonType: "string" }, state: { bsonType: "string" }, postalCode: { bsonType: "string" }, country: { bsonType: "string" } } } } } } }) ``` In Fauna, you can control the structure of a collection’s document type using its collection schema: ```fsl // Defines the `Customer` collection. collection Customer { name: String email: String status: "silver" | "gold" | "platinum"? address: { street: String city: String state: String postalCode: String country: String } *: Any } ``` #### [](#edit-a-collection)Edit a collection The following examples add an `points` field definition to the previous `Customer` collection schema. In MongoDB, you can edit a collection’s schema validator: ```mql db.runCommand({ collMod: "Customer", validator: { $jsonSchema: { bsonType: "object", required: ["email", "name", "address"], properties: { email: { bsonType: "string" }, name: { bsonType: "string" }, status: { bsonType: "string", enum: ["silver", "gold", "platinum"] }, // Adds the `points` field. Accepts `int` or `null` values. // Accepting `null` means the field is not required. points: { bsonType: "int" }, address: { bsonType: "object", required: ["street", "city", "state", "postalCode", "country"], properties: { street: { bsonType: "string" }, city: { bsonType: "string" }, state: { bsonType: "string" }, postalCode: { bsonType: "string" }, country: { bsonType: "string" } } } } } } }) ``` Schema validator changes apply only to new writes, not existing documents. Existing documents may retain invalid field values until updated. For instance, `Customer` documents with `points` as a `string` before the validator updates would keep this invalid format until updated. In Fauna, you can update a collection’s document type using a zero-downtime schema migration: ```fsl collection Customer { name: String email: String status: "silver" | "gold" | "platinum"? address: { street: String city: String state: String postalCode: String country: String } // Adds the `points` field. Accepts `int` or `null` values. // Accepting `null` means the field is not required. points: Int? // Adds the `typeConflicts` field as a catch-all field for // existing `points` values that aren't `Int` or `null`. typeConflicts: { *: Any }? *: Any migrations { // Adds the `typeConflicts` field. add .typeConflicts // Adds the `points` field. add .points // Nests non-conforming `points` and `typeConflicts` // field values in the `typeConflicts` catch-all field. move_conflicts .typeConflicts } } ``` In Fauna, schema migrations can affect existing documents. After the migration, the `points` field must be an `Int` value or not present (`null`). If a document a non-conforming `points` value, such as a `String`, before the migration, the migration nests the value in the `typeConflicts` field. #### [](#view-collections)View collections ```mql db.getCollectionNames() ``` ```fql Collection.all() ``` #### [](#delete-a-collection)Delete a collection ```mql db.Customer.drop() ``` To delete a collection, delete its schema. 
You can create and manage schema using any of the following: * The [Fauna CLI](../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../learn/schema/manage-schema/#fql) Deleting a collection deletes its documents and indexes. ### [](#create-and-manage-documents)Create and manage documents #### [](#create-a-document)Create a document ```mql db.Customer.insertOne({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" }, points: 42 }) ``` ```fql Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" }, points: 42 }) ``` #### [](#edit-a-document)Edit a document ```mql db.Customer.updateOne( {_id: ObjectId("111")}, {$set: {email: "jdoe2@example.com"}} ) ``` ```fql Customer.byId("111")?.update({ email: "jdoe2@example.com" }) ``` #### [](#delete-a-document)Delete a document ```mql db.Customer.deleteOne({ _id: ObjectId("111") }) ``` ```fql Customer.byId("111")?.delete() ``` #### [](#perform-bulk-writes)Perform bulk writes The following examples set the `status` field to `gold` for existing `Customer` collection documents. ```mql db.Customer.updateMany( {}, { $set: { status: "gold" } } ) ``` ```fql // Uses `forEach()` to iterate through each // document in the `Customer` collection. Customer.all().forEach(doc => doc.update({ status: "gold" })) // The query doesn't return a value. ``` #### [](#view-all-collection-documents)View all collection documents ```mql db.Customer.find() ``` ```fql Customer.all() ``` ### [](#indexes-and-read-queries)Indexes and read queries #### [](#create-an-index)Create an index The following examples create an index that covers the `email` and `name` fields of the `Customer` collection. ```mql db.Customer.createIndex({ "email": 1, "name": 1 }) ``` In Fauna, you define indexes in FSL as part of a collection schema: ```fsl collection Customer { ... index byEmail { // `terms` are document fields for exact match searches. // In this example, you get `Customer` collection documents // by their `email` field value. terms [.email] // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // descending `name` field value. values [.name] } } ``` #### [](#exact-match-search)Exact match search ```mql // MongoDB automatically uses indexed terms, if applicable. db.Customer.find({email: "jdoe@example.com"}) ``` ```fql // Runs an unindexed query. Customer.where(.email == "jdoe@example.com") ``` For better performance on large datasets, use an index with the `email` term to run an exact match search. Define the index in the collection schema: ```fsl collection Customer { ... // Defines the `byEmail()` index for the `Customer` // collection. index byEmail { // Includes the `email` field as an index term. terms [.email] values [.name] } } ``` You call an index as a method on its collection: ```fql // Uses the `Customer` collection's `byEmail()` index // to run an exact match search on an `email` field value. Customer.byEmail("jdoe@example.com") ``` #### [](#sort-collection-documents)Sort collection documents ```mql // Sorts `Product` collection documents by: // - `price` (ascending), then ... // - `name` (ascending), then ... 
// - `description` (ascending), then ...
// - `stock` (ascending).
db.Product.find().sort({price: 1, name: 1, description: 1, stock: 1})
```

```fql
// Runs an unindexed query.
// Sorts `Product` collection documents by:
// - `price` (ascending), then ...
// - `name` (ascending), then ...
// - `description` (ascending), then ...
// - `stock` (ascending).
Product.all().order(.price, .name, .description, .stock)
```

For better performance on large datasets, use an index with index values to sort collection documents.

Define the index in the collection schema:

```fsl
collection Product {
  ...

  // Defines the `sortedByPriceLowToHigh()` index for the
  // `Product` collection.
  index sortedByPriceLowToHigh {
    values [.price, .name, .description, .stock]
  }
}
```

Call the index in a query:

```fql
// Uses the `Product` collection's `sortedByPriceLowToHigh()` index
// to sort `Product` collection documents by:
// - `price` (ascending), then ...
// - `name` (ascending), then ...
// - `description` (ascending), then ...
// - `stock` (ascending).
Product.sortedByPriceLowToHigh()
```

#### [](#range-search)Range search

```mql
// Get `Customer` collection documents with a `points` value between
// 18 (inclusive) and 100 (inclusive).
db.Customer.find({ points: { $gte: 18, $lte: 100 }})
```

```fql
// Runs an unindexed query.
// Get `Customer` collection documents with a `points` value between
// 18 (inclusive) and 100 (inclusive).
Customer.all().where(.points >= 18 && .points <= 100)
```

For better performance on large datasets, use an index with the `points` value to run range searches.

Define the index in the collection schema:

```fsl
collection Customer {
  ...

  // Defines the `sortedByPoints()` index for the `Customer`
  // collection.
  index sortedByPoints {
    // Includes the `points` field as an index value.
    values [.points]
  }
}
```

Call the index in a query:

```fql
// Use the `sortedByPoints()` index to get `Customer`
// collection documents with a `points` value between
// 18 (inclusive) and 100 (inclusive).
// Also sorts results by ascending `points` value.
Customer.sortedByPoints({ from: 18, to: 100 })
```

### [](#document-relationships)Document relationships

#### [](#create-a-relationship-between-documents)Create a relationship between documents

The following examples create:

* A one-to-one relationship between an `Order` collection document and a `Customer` document.
* A one-to-many relationship between the `Order` document and several `OrderItem` documents.
* A one-to-one relationship between each `OrderItem` document and a `Product` document.

In MongoDB, you model relationships between documents using embedded documents or key-based document references. MongoDB doesn’t directly support document relationships.

**Embedded documents**

To embed a document, you duplicate and nest a document’s data inside another document:

```mql
db.Order.insertOne({
  // Embeds a `Customer` document's data in the `Order` document's
  // `customer` field.
  customer: {
    customerId: "12345",
    name: "John Doe",
    email: "jdoe@example.com",
    ...
  },
  status: "cart",
  // Embeds an array of `OrderItem` documents.
  items: [
    {
      orderItemID: "12345",
      // Embeds a `Product` document's data in the `product` field.
      product: {
        productID: "111",
        name: "cups",
        description: "Translucent 9 Oz, 100 ct",
        price: 698,
        ...
      },
      quantity: 8
    },
    {
      orderItemID: "67890",
      product: {
        productID: "222",
        name: "donkey pinata",
        description: "Original Classic Donkey Pinata",
        price: 2499,
        ...
      },
      quantity: 10
    }
  ]
})
```

Embedding documents increases storage costs and can impact performance.
Embedding documents can also make it difficult to keep data synced and up to date. **Key-based document references** To use a key-based document reference, you include a document’s ID as a field value. Key-based references are untyped and indistinguishable from other fields. ```mql db.Order.insertOne({ // Adds a `Customer` document's ID as the `Order` document's // `customer` field value. customer: new ObjectId("333"), status: "cart", items: [ { // Adds an `OrderItem` document's ID as the `orderItem` field value. orderItem: new ObjectId("12345") }, { orderItem: new ObjectId("67890") } ] }) ``` In MongoDB, document references are not dynamically resolved on read. See [Resolve document references](#resolve-document-references). In Fauna, you can define and enforce document relationships using typed field definitions in the FSL collection schema. You can also index fields that contain document references. ```fsl collection Order { // Accepts a reference to a `Customer` collection document. customer: Ref status: "cart" | "processing" | "shipped" | "delivered" createdAt: Time // `items` contains a Set of `OrderItem` collection documents. compute items: Set = (order => OrderItem.byOrder(order)) compute total: Number = (order => order.items.fold(0, (sum, orderItem) => { if (orderItem.product != null) { sum + orderItem.product!.price * orderItem.quantity } else { sum } })) payment: { *: Any } // Defines the `byCustomer()` index. // Use the index to get `Order` documents by `customer` value. index byCustomer { terms [.customer] values [desc(.createdAt), .status] } } ``` To instantiate the relationship, include a [reference to the document](../../learn/data-model/relationships/) as a field value: ```fql // Creates an `Order` collection document. let cartOrder = Order.create({ status: "cart", // Adds a reference to the `Customer` collection // document as a field value. customer: Customer.byId("333"), createdAt: Time.now(), payment: {} }) // Creates an `OrderItem` collection document. OrderItem.create({ // Adds a reference to the `Product` collection // document as a field value. product: Product.byId("111"), quantity: 8, // Adds a reference to the previous `Order` // collection document as a field value. order: cartOrder }) OrderItem.create({ product: Product.byId("222"), quantity: 10, order: cartOrder }) ``` This approach avoids duplicating data. You can use projection to automatically resolve the document reference. See [Resolve document references](#resolve-document-references). #### [](#resolve-document-references)Resolve document references If you use key-based document references in MongoDB, you must resolve the references yourself, typically with `$lookup` aggregation stages or additional queries: ```mql db.Order.aggregate([ { $match: { customer: new ObjectId("12345") } }, { $lookup: { from: "Customer", localField: "customer", foreignField: "_id", as: "customer" } }, { $project: { customer: { $arrayElemAt: ["$customer", 0] }, status: 1, items: 1 } } ]) ``` ``` [ { _id: ObjectId('66846ba8c516f4578cd7f153'), // Resolves the `Customer` document in the `customer` field. customer: { _id: ObjectId('66846978c516f4578cd7f14f'), name: "John Doe", email: "jdoe@example.com", ... }, status: "cart", items: [ { // Doesn't resolve nested `OrderItem` documents. orderItem: ObjectId('668469d2c516f4578cd7f151') }, { orderItem: ObjectId('6684699fc516f4578cd7f150') } ] }, ... ] ``` These queries compound if a collection’s documents include multiple or deeply nested document references.
In FQL, you dynamically resolve document relationships by projecting fields that contain document references. You can resolve multiple, deeply nested relationships in a single query: ```fql // Uses a variable to reference a `Customer` collection document. let customer = Customer.byId("12345") // Uses the `Order` collection's `byCustomer()` index to get // `Order` documents based on their `customer` value. The // previous `Customer` document is passed to the index call. let orders = Order.byCustomer(customer) orders { customer { name, email }, items { product { name, description, price }, quantity }, total, status } ``` The projection fetches any related documents: ``` { data: [ { // Resolves the `Customer` collection document in // the `customer` field. customer: { name: "John Doe", email: "jdoe@example.com" }, // Resolves the Set of `OrderItem` collection documents in // the `items` field. items: { data: [ { // Resolves nested `Product` documents in // `OrderItem` documents. product: { name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698 }, quantity: 2 }, { product: { name: "donkey pinata", description: "Original Classic Donkey Pinata", price: 2499 }, quantity: 1 }, { product: { name: "pizza", description: "Frozen Cheese", price: 499 }, quantity: 3 } ] }, total: 5392, status: "cart" }, ... ] } ``` # Fauna for DynamoDB users This guide outlines major differences between Amazon DynamoDB and Fauna. The guide also translates common DynamoDB queries to Fauna Query Language (FQL) and Fauna Schema Language (FSL). ## [](#major-differences)Major differences The following table outlines major differences between DynamoDB and Fauna. | Difference | DynamoDB | Fauna | | --- | --- | --- | | Data model | DynamoDB is a key-value datastore. Data is stored as simple key-value pairs or as documents. | Combines document flexibility with native relational capabilities. Stores data as denormalized and normalized JSON-like documents in collections with relationship traversals. | | Query capabilities | Does not support joins, subqueries, aggregations, or complex filtering. | Supports joins, subqueries, aggregations, and complex filtering through FQL. | | Indexes | Requires a primary key that cannot be changed after creation. Secondary indexes are limited to a maximum of 20 per table. Global secondary indexes (GSIs) are eventually consistent. Local secondary indexes (LSIs) can limit table performance. | There is no primary key constraint. Indexes are strongly consistent. Fauna provides native support for secondary indexes without DynamoDB’s limitations. For example, you can index nested fields and Arrays. Fauna does not limit the number of indexes per collection. | | Consistency | Eventual consistency across regions with no guarantee of data consistency at any given time. Reads may not show the latest write. | Strong consistency by default across multiple regions, continents, and cloud providers. Fauna guarantees that all reads show the latest write, even across regions. | | Distribution | Single region. Can be configured for multi-region with global tables. | Multi-region by default. | | Schema | Performance heavily depends on data distribution and common access patterns. Good performance requires a well-designed schema and a deep understanding of application access patterns from the start. Adapting or evolving a schema to new access patterns can require significant work.
| Flexible schema model that allows you to adapt to new or evolving access patterns with built-in tools for zero-downtime schema migrations. FSL enables version control for schema changes, integration with CI/CD pipelines, and progressive schema enforcement as applications evolve. | | Operability | A simple API based on PUT and GET operations. Logic must live in application code. Doesn’t allow you to read data and use that data in a write operation in the same transaction. | Fauna uses FQL, a declarative language that allows you to express complex data logic in a single query. FQL supports user-defined functions (UDFs) to encapsulate logic and reuse it across queries. Allows multiple read, write, and update operations in the same transaction. You can use the results of reads in writes and updates while keeping the transaction consistent. | ## [](#concepts)Concepts The following table maps common DynamoDB concepts to their equivalents in Fauna. | DynamoDB | Fauna | Notes | | --- | --- | --- | | Database | Database | | | Table | Collection | | | Partition Key (Simple Primary Key) | Document id | You can mimic some aspects of a primary key using a unique constraint and an index term. See Create a table. | | Partition Key and Sort Key (Composite Primary Key) | Flexible indexing system | You can create multiple indexes on a collection, allowing for various query patterns without being constrained to a single primary key structure. | | Index | Index | Fauna indexes must be named. This encourages better readability and more predictable behavior. | ## [](#examples)Examples The following examples compare basic operations in DynamoDB and Fauna. DynamoDB examples use the AWS CLI. Fauna examples use FQL and FSL. ### [](#create-and-manage-tables-or-collections)Create and manage tables or collections #### [](#create-a-table)Create a table Create a table named `Product` with a primary key `id`. ```bash aws dynamodb create-table \ --table-name Product \ --attribute-definitions AttributeName=id,AttributeType=S \ --key-schema AttributeName=id,KeyType=HASH \ --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 \ --region us-west-2 ``` To create a collection, create a collection schema in the database. An example FSL collection schema: ```fsl // Defines the `Product` collection. collection Product { // Wildcard constraint. // Allows arbitrary ad hoc fields of any type. *: Any // If a collection schema has no field definitions // and no wildcard constraint, it has an implicit // wildcard constraint of `*: Any`. } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../learn/schema/manage-schema/#fql) #### [](#list-all-tables)List all tables ```bash aws dynamodb list-tables ``` ```fql Collection.all() ``` #### [](#delete-a-table)Delete a table ```bash aws dynamodb delete-table --table-name Customer ``` To delete a collection, delete its schema using the Dashboard or Fauna CLI. Deleting a collection deletes its documents and indexes.
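If you manage schema programmatically, you can do the same thing with FQL schema methods. A minimal sketch, using the `Customer` collection from the example above:

```fql
// Deletes the `Customer` collection's schema document from the
// `Collection` system collection. This also deletes the collection's
// documents and indexes.
Collection.byName("Customer")?.delete()
```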
### [](#create-and-manage-documents)Create and manage documents #### [](#create-a-document)Create a document ```bash aws dynamodb put-item \ --table-name Customer \ --item '{ "name": {"S": "John Doe"}, "email": {"S": "jdoe@example.com"}, "address": { "M": { "street": {"S": "87856 Mendota Court"}, "city": {"S": "Washington"}, "state": {"S": "DC"}, "postalCode": {"S": "20220"}, "country": {"S": "US"} } } }' ``` ```fql Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) ``` #### [](#edit-a-document)Edit a document The following example updates the `name` attribute of a document. In this example, the `email` attribute is the primary key. ```bash aws dynamodb update-item \ --table-name Customer \ --key '{ "email": {"S": "jdoe@example.com"} }' \ --update-expression "SET #nm = :name" \ --expression-attribute-names '{ "#nm": "name" }' \ --expression-attribute-values '{ ":name": {"S": "Jane Doe"} }' ``` The following example gets the first document with a specific email address and updates its `name` field value. ```fql Customer.firstWhere(.email == "jdoe@example.com")?.update({ name: "Jane Doe" }) ``` For better performance on large datasets, use an index with the `email` term to run an exact match search. Define the index in the collection schema: ```fsl collection Customer { ... // Defines the `byEmail()` index for the `Customer` // collection. index byEmail { // Includes the `email` field as an index term. terms [.email] values [.name] } } ``` You call an index as a method on its collection: ```fql // Uses the `Customer` collection's `byEmail()` index // to run an exact match search on an `email` field value. Customer.byEmail("jdoe@example.com").first()?.update({ name: "Jane Doe" }) ``` #### [](#update-nested-document-fields)Update nested document fields ```bash aws dynamodb update-item \ --table-name Customer \ --key '{ "email": {"S": "jdoe@example.com"} }' \ --update-expression "SET address.#city = :city" \ --expression-attribute-names '{ "#city": "city" }' \ --expression-attribute-values '{ ":city": {"S": "Newtown"} }' ``` ```fql Customer.byEmail("jdoe@example.com").first()?.update({ address: { street: "87856 Mendota Court", city: "Newtown", state: "DC", postalCode: "20220", country: "US" } }) ``` #### [](#delete-a-document)Delete a document ```bash aws dynamodb delete-item \ --table-name Customer \ --key '{ "email": {"S": "jdoe@example.com"} }' ``` ```fql Customer.byEmail("jdoe@example.com").first()?.delete() ``` #### [](#perform-bulk-writes)Perform bulk writes The following examples set the `status` field to `gold` for existing `Customer` collection documents. First, retrieve the existing items from the `Customer` table. ```bash aws dynamodb scan --table-name Customer --attributes-to-get "email" > customers.json ``` Next, prepare a JSON file named batch-write.json with the items you want to update. 
```json { "Customer": [ { "PutRequest": { "Item": { "email": { "S": "jdoe@example.com" }, "address": { "M": { "state": { "S": "District of Columbia" } } } } } }, { "PutRequest": { "Item": { "email": { "S": "bob.brown@example.com" }, "address": { "M": { "state": { "S": "District of Columbia" } } } } } }, { "PutRequest": { "Item": { "email": { "S": "carol.clark@example.com" }, "address": { "M": { "state": { "S": "District of Columbia" } } } } } } ] } ``` ```bash aws dynamodb batch-write-item --request-items file://batch-write.json ``` ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. let customers = Customer.where(.address?.state == "DC") // Use `forEach()` to update each document in the previous Set. customers.forEach(doc => doc.update({ address: { street: doc?.address?.street, city: doc?.address?.city, state: "District of Columbia", postalCode: doc?.address?.postalCode, country: doc?.address?.country, } })) // `forEach()` returns `null`. ``` ### [](#indexes-and-read-queries)Indexes and read queries #### [](#create-an-index)Create an index Run the following command to create a GSI named `byEmail` on the `Customer` table. ```bash aws dynamodb update-table \ --table-name Customer \ --attribute-definitions AttributeName=email,AttributeType=S \ --global-secondary-index-updates \ "[{\"Create\":{\"IndexName\": \"byEmail\",\"KeySchema\":[{\"AttributeName\":\"email\",\"KeyType\":\"HASH\"}],\"Projection\":{\"ProjectionType\":\"ALL\"},\"ProvisionedThroughput\":{\"ReadCapacityUnits\":5,\"WriteCapacityUnits\":5}}}]" ``` If you use global tables, you must run this command separately in each region. GSIs don’t automatically replicate across regions. DynamoDB can only index top-level attributes; it can’t index nested attributes. In Fauna, you define and manage indexes in FSL as part of a collection schema: ```fsl collection Customer { ... index byEmail { // `terms` are document fields for exact match searches. // In this example, you get `Customer` collection documents // by their `email` field value. terms [.email] // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // ascending `firstName` and `lastName` field values. values [.firstName, .lastName] } } ``` Fauna indexes are globally replicated automatically across all regions. You don’t need to create indexes in each region. You can also index nested fields. #### [](#exact-match-search)Exact match search ```bash aws dynamodb query \ --table-name Customer \ --index-name byEmail \ --key-condition-expression "email = :email" \ --expression-attribute-values '{":email":{"S":"user@example.com"}}' ``` ```fql // Runs an unindexed query. Customer.where(.email == "jdoe@example.com") ``` For better performance on large datasets, use an index with the `email` term to run an exact match search. You call an index as a method on its collection: ```fql // Uses the `Customer` collection's `byEmail()` index // to run an exact match search on an `email` field value. Customer.byEmail("jdoe@example.com") ``` #### [](#sort-collection-documents)Sort collection documents Find all products that have a `name` of `pizza` and sort them by ascending `price`. Make sure your DynamoDB table has a partition key `name` and sort key `price`.
```bash aws dynamodb create-table \ --table-name Product \ --attribute-definitions \ AttributeName=name,AttributeType=S \ AttributeName=price,AttributeType=N \ --key-schema \ AttributeName=name,KeyType=HASH \ AttributeName=price,KeyType=RANGE \ --provisioned-throughput \ ReadCapacityUnits=5,WriteCapacityUnits=5 ``` ```bash aws dynamodb query \ --table-name Product \ --key-condition-expression "#name = :name" \ --expression-attribute-names '{"#name": "name"}' \ --expression-attribute-values '{":name": {"S": "pizza"}}' \ --no-scan-index-forward \ --output json ``` ```fql // Runs an unindexed query. // Returns a Set of `Product` collection documents with // a `name` of `pizza`. Sorts the documents by: // - `price` (ascending) Product.where(.name == 'pizza').order(.price) ``` For better performance on large datasets, use an index with values to sort collection documents. ```fsl collection Product { ... // Defines the `byName()` index for the `Product` collection. index byName { // Includes the `name` field as an index term. // Use the term to run exact match searches. terms [.name] // Includes `price` as an index value. // Sorts the documents by: // - `price` (ascending) values [.price] } } ``` Call the index in a query: ```fql // Gets a Set of products with a `name` of `pizza`. // The Set is sorted by `price` (ascending). Product.byName("pizza") ``` # Fauna for SQL users This guide outlines major differences between traditional relational databases (RDBs) and Fauna. It also: * Maps traditional RDB concepts to Fauna * Translates common SQL queries to Fauna Query Language (FQL). ## [](#major-differences)Major differences The following table outlines major differences between traditional RDBs and Fauna. | Difference | Traditional RDB | Fauna | | --- | --- | --- | --- | --- | | Data model | Stores data in tables with rows and columns. | Stores data as JSON documents in collections. | | Schema | Requires upfront schema definition with a fixed data type for each column. | Flexible schema model with optional field definitions and constraints. Migrations require zero downtime. | | Relationships | Uses foreign keys and joins to connect data across tables. | Uses document references to create relationships between documents in different collections. | | Query language | Uses SQL, which relies on various commands. | Uses FQL, which has a Typescript-like syntax and relies on methods. | | Data definition | Uses SQL to create databases, create tables, and alter tables. | Uses Fauna Schema Language (FSL) to define, create, and update collections as schema.You can create and manage schema using any of the following:The Fauna CLIThe Fauna DashboardThe Fauna Core HTTP API’s Schema endpointsFQL schema methods | ## [](#concepts)Concepts The following table maps common concepts from traditional RDBs to their equivalent in Fauna. | SQL | Fauna | Notes | | --- | --- | --- | --- | --- | | Record / Row | Document | | | Column | Document fields | | | Table | Collection | | | Database | Database | | | Primary key | Document id | You can mimic some aspects of a primary key using a unique constraint and an index term. See Create a table. | | Index / Materialized Views | Index | Fauna indexes must be named. This encourages better readability and more predictable behavior. | | Foreign key | Document reference | | | Stored procedure | User-defined function | | | Transactions | Transactions | | ## [](#examples)Examples The following examples compare basic operations in SQL and Fauna. 
The SQL examples use the `dept` (departments) and `emp` (employees) tables: ```sql SQL> DESC dept Name Null? Type ----------------------------------------- -------- DEPTNO NOT NULL NUMBER(2) DNAME VARCHAR2(14) LOC VARCHAR2(13) ZIP NUMBER ``` ```sql SQL> DESC emp Name Null? Type ----------------------------------------- -------- EMPNO NOT NULL NUMBER(4) ENAME VARCHAR2(10) JOB VARCHAR2(9) MGR NUMBER(4) HIREDATE DATE SAL NUMBER(7,2) COMM NUMBER(7,2) DEPTNO NUMBER(2) ``` The Fauna examples use the corresponding `Dept` and `Emp` collections. The collections use the following schema: ```fsl collection Dept { deptno: Number dname: String? loc: String? zip: Number? unique [.deptno] unique [.dname] index byDeptNo { terms [.deptno] } index byDeptName { terms [.dname] values [.deptno] } index sortedByDeptNoLowToHigh { values [.deptno, .dname, .loc, .zip] } } ``` ```fsl collection Emp { empno: Number ename: String? job: String? mgr: Number? hiredate: Date? sal: Number? comm: Number? deptno: Number? index byDeptNo { terms [.deptno] } index sortedBySalaryLowToHigh { values [.sal, .deptno] } index sortedBySalaryHighToLow { values [desc(.sal), .deptno] } } ``` ### [](#create-and-alter)Create and alter This section covers common data definition operations in SQL and Fauna. #### [](#create-a-database)Create a database **CREATE DATABASE** ```sql CREATE DATABASE employees; ``` **Create a database** Fauna is multi-tenant. You can create a parent database with one or more nested child databases. You can create top-level databases in the [Fauna Dashboard](https://dashboard.fauna.com/) or with the [Fauna CLI](../../build/cli/v4/). You can then run a query to create a child database: ```fql Database.create({ name: "employees" }) ``` #### [](#create-a-table)Create a table **CREATE TABLE** ```sql CREATE TABLE dept( deptno NUMBER(2,0), dname VARCHAR2(14), loc VARCHAR2(13), CONSTRAINT pk_dept PRIMARY KEY (deptno) ); ``` **Create a collection** To create a collection, create a collection schema in the database. You can create and manage schema using any of the following: * The [Fauna CLI](../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../learn/schema/manage-schema/#fql) For example, the `Dept` collection has the following schema: ```fsl collection Dept { deptno: Number dname: String? loc: String? ... unique [.deptno] ... index byDeptNo { terms [.deptno] } ... } ``` The collection’s `deptno` field mimics some aspects of a primary key: * The `deptno` field is required in incoming documents. * Each document in the collection must have a unique `deptno` value. * You can use the `byDeptNo()` index to fetch documents based on `deptno`. Unlike a SQL primary key, `deptno` doesn’t replace the document ID: Fauna still assigns each document its own `id`, which remains the document’s identifier. #### [](#add-a-column)Add a column **ALTER TABLE: ADD COLUMN** ```sql ALTER TABLE dept ADD (zip NUMBER); ``` **Add a field definition** Fauna collections are schemaless by default. Documents can contain any field of any type. Documents in the same collection aren’t required to have the same fields. You can enforce a document structure by adding [field definitions](../../reference/fsl/field-definitions/) to the collection schema: ```fsl collection Dept { zip: Number? ... } ``` #### [](#truncate-a-table)Truncate a table **TRUNCATE TABLE** In SQL, `truncate` removes all records, but preserves the structure of the table.
```sql TRUNCATE TABLE dept; ``` **Delete and recreate a collection** In FQL, the equivalent is to delete and recreate the collection with the same schema. See [Drop a table](#drop-a-table) and [Create a table](#create-a-table). #### [](#drop-a-table)Drop a table **DROP TABLE** ```sql DROP TABLE dept; ``` **Delete a collection** To delete a collection, delete its schema using the Dashboard or the [Fauna CLI](../../build/cli/v4/)'s [`fauna schema push`](../../build/cli/v4/commands/schema/push/) command. Deleting a collection deletes its documents and indexes. ### [](#insert-update-and-delete)Insert, update, and delete This section covers common data manipulation operations in SQL and Fauna. #### [](#insert-a-record)Insert a record **INSERT** ```sql INSERT INTO dept (deptno, dname, loc) VALUES (10, "ACCOUNTING", "NEW YORK"); ``` **Create a document** ```fql Dept.create({ deptno: 10, dname: "ACCOUNTING", loc: "NEW YORK" }) ``` #### [](#update-a-record)Update a record **UPDATE** ```sql UPDATE dept SET loc = "AUSTIN" WHERE deptno = 10; ``` **Update a document** ```fql Dept.where(.deptno == 10).first() ?.update({ loc: "AUSTIN" }) ``` [`collection.where()`](../../reference/fql-api/collection/instance-where/) requires a scan of the entire collection and isn’t performant on large collections. For better performance, use an index with the `deptno` term to run an exact match search: ```fql Dept.byDeptNo(10).first() ?.update({ loc: "AUSTIN" }) ``` The query uses method chaining to: * Call the `Dept` collection’s `byDeptNo()` index to get a [Set](../../reference/fql/types/#set) of documents with a `deptno` of `10`. `depto` is the only term for the index. Because of the collection’s [unique constraint](../../reference/fsl/unique/) on `deptno`, the Set only contains one document. * Call [`set.first()`](../../reference/fql-api/set/first/) to get the first (and only) document from the Set. * Call [`document.update()`](../../reference/fql-api/document/update/) to update the document. Indexes store, or cover, their terms and values for quicker retrieval than a collection scan. #### [](#delete-a-record)Delete a record **DELETE** ```sql DELETE FROM dept WHERE deptno = 10; ``` **Delete a document** ```fql Dept.where(.deptno == 10).first() ?.delete() ``` For better performance, use an index with the `deptno` term instead: ```fql Dept.byDeptNo(10).first() ?.delete() ``` ### [](#select)Select This section covers common read operations in SQL and Fauna. #### [](#select-all-records)Select all records **SELECT: ALL ROWS** ```sql SELECT * FROM dept; ``` **Get all documents** ```fql Dept.all() ``` Like [`collection.where()`](../../reference/fql-api/collection/instance-where/), [`collection.all()`](../../reference/fql-api/collection/instance-all/) requires a scan of the entire collection. It isn’t performant on large collections. Instead, use an index and [projection](../../reference/fql/projection/) to only get the specific fields you need: ```fql Dept.sortedByDeptNoLowToHigh() { dname, loc } ``` `dname` and `loc` are values of the `sortedByDeptNoLowToHigh()` index. The above query is covered: It can fetch `dname` and `loc` values without scanning the entire `Dept` collection. 
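On large collections, you may also want to control how many documents each page of results returns. A minimal sketch, using an arbitrary page size of `50`:

```fql
// Reads the covered index and returns up to 50 documents per page,
// projecting only the `dname` and `loc` fields.
Dept.sortedByDeptNoLowToHigh().pageSize(50) {
  dname,
  loc
}
```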
#### [](#select-based-on-a-single-parameter)Select based on a single parameter **SELECT with WHERE** ```sql SELECT * FROM dept WHERE deptno = 10; ``` **Exact match search with an index term** ```fql Dept.where(.deptno == 10) ``` For better performance, use an index with the `deptno` term to run an exact match search: ```fql Dept.byDeptNo(10) ``` #### [](#select-using-inequality)Select using inequality **SELECT with an inequality comparison** ```sql SELECT * FROM dept WHERE deptno != 10; ``` **Filter with `.where()`** ```fql Dept.where(.deptno != 10) ``` For better performance, use an index that includes `deptno` as an index value: ```fql Dept.sortedByDeptNoLowToHigh() .where(.deptno != 10) ``` The query uses [`set.where()`](../../reference/fql-api/set/where/) to filter the Set returned by the index. The query is more performant because it: * Applies `where()` to a smaller Set of documents * Filters off a covered index value, `deptno` #### [](#select-based-on-a-list)Select based on a list **SELECT with IN** ```sql SELECT * FROM dept WHERE deptno IN (10,11,12) ``` **Get documents using `.map()`** ```fql // Convert Array to Set let deptNums = [10, 11, 12].toSet() // Iterate through the Set deptNums.map((deptno) => { // Get a `Dept` document for each dept num Dept.byDeptNo(deptno).first() }) ``` FQL provides several methods for iterating over a Set. [`set.forEach()`](../../reference/fql-api/set/foreach/), [`set.map()`](../../reference/fql-api/set/map/), and [`set.flatMap()`](../../reference/fql-api/set/flatmap/) are similar but used for different purposes: | Method | Primary use | Notes | | --- | --- | --- | | set.forEach() | Perform in-place writes on Set elements. | Doesn’t return a value. | | set.map() | Transform Set elements and return the results as a new Set. | Can’t perform writes. | | set.flatMap() | Similar to set.map(), but flattens the resulting Set by one level. | Can’t perform writes. | If `deptno` values weren’t unique, you could use `flatMap()` to flatten the resulting nested Set: ```fql // Convert Array to Set let deptNums = [10, 11, 12].toSet() // Iterate through the Set and // flatten the resulting Set by one level deptNums.flatMap((deptno) => { Dept.byDeptNo(deptno) }) ``` #### [](#select-by-id)Select by ID **SELECT: Based on a row id** ```sql SELECT * FROM emp WHERE id = 2349879823 ``` **Get a document with `byId`** ```fql Emp.byId("395238614905126976") ``` Use [`collection.byId()`](../../reference/fql-api/collection/instance-byid/) to get a document by its `id`. #### [](#range-search)Range search **SELECT with a range condition** ```sql SELECT * FROM emp WHERE sal >= 20000 ``` **Ranged search with an index value** ```fql Emp.where(.sal >= 20000) ``` For better performance, use an index with the `sal` value to run a ranged search: ```fql Emp.sortedBySalaryLowToHigh({ from: 20000 }) ``` `sal` is the first value of the `sortedBySalaryLowToHigh()` index. #### [](#group-by)Group by **SELECT with GROUP BY** The following query selects the maximum salary for each department: ```sql SELECT MAX(sal), deptno FROM emp GROUP BY deptno; ``` **Get grouped documents** ```fql // Get Set of department numbers let deptNums = Dept.sortedByDeptNoLowToHigh() { deptno } // Get the first employee for each department number. // Employees are sorted by salary from high to low.
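// Because the Set is sorted by descending salary, `firstWhere()` returns
// the highest-paid employee in each department, mirroring
// `MAX(sal) ... GROUP BY deptno` in SQL.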
deptNums.map((deptNum) => { Emp.sortedBySalaryHighToLow().firstWhere( .deptno == deptNum.deptno ) { sal, deptno } }) ``` You can also use the built-in FQL [`array.fold()`](../../reference/fql-api/array/fold/) and [`set.fold()`](../../reference/fql-api/set/fold/) methods to create a `groupBy()` [FQL function](../../reference/fql/functions/) or a [user-defined function (UDF)](../../learn/schema/user-defined-functions/) that outputs the same results as `GROUP BY`. See [Group By: Aggregate data in Fauna](../../learn/query/patterns/group-by/). #### [](#joins)Joins **EQUI-JOIN two tables** ```sql SELECT e.* FROM emp e, dept d WHERE e.deptno = d.deptno AND d.dname = "SALES"; ``` **Get documents based on data from another collection** ```fql // Get the `deptno` for the "Sales" department let salesDeptNo = Dept.byDeptName("Sales").first() { deptno } // Get employees with a matching `deptno` Emp.byDeptNo(salesDeptNo?.deptno) ``` Instead of using a foreign key, such as `deptno`, you can directly reference documents in other collections. For example, update the `Emp` collection schema to include: * A nullable `dept` field definition that accepts `Dept` collection documents * A migration statement for the new `dept` field * A `byDept()` index definition that lets you get `Emp` collection documents by `dept` value ```fsl collection Emp { ... // Field definition for the `dept` field. Accepts a reference // to a `Dept` collection document or `null` (not present). dept: Ref? // Migrations block. Contains an `add` migration statement // for the new `dept` field. migrations { add .dept } // Defines the `byDept()` index. Use the index to get // `Emp` documents by their `dept` field value. index byDept { terms [.dept] } ... } ``` Then create an `Emp` document with a `dept` field. The field references a document in the `Dept` collection: ```fql // Get "Sales" dept document let salesDept = Dept.byDeptName("Sales").first() Emp.create({ empno: 123456, ename: "John Doe", sal: 2000, // Create a reference to "Sales" dept document dept: salesDept }) ``` Use the `byDept()` index to get `Emp` documents with a `dept` field that references a specific `Dept` collection document: ```fql // Get "Sales" dept document let salesDept = Dept.byDeptName("Sales").first() // Get `Emp` documents that // reference the "Sales" dept document Emp.byDept(salesDept) { ename, sal, dept } ``` Use [projection](../../reference/fql/projection/) to resolve the reference in results. This is similar to performing a join. ## [](#set-operations)Set operations You can use [set instance methods](../../reference/fql-api/set/#instance-methods) to perform SQL-like [set operations](https://en.wikipedia.org/wiki/Set_operations_\(SQL\)), such as unions, joins, and intersections, in FQL. For examples, see [Work with multiple Sets](../../learn/query/patterns/sets/). # Data model This section of the Fauna documentation describes the architectural aspects of data management and modeling principles, the data model schema naming and taxonomy, and the components of the model. ## [](#in-this-section)In this section [Databases and multi-tenancy](databases/) This section defines the concept of a Fauna database. [Collections](collections/) This section describes collections as a way to group documents in a database. [Sets](sets/) This section describes Sets as a grouping of objects that is a subset of a collection, including Set characteristics and the methods that operate on Sets. 
[User-defined functions (UDFs)](../schema/user-defined-functions/) This section describes functions defined by an FQL user to query and update a database, and can be called by name. [Data modeling best practices](best-practices/) Get best practices for modeling your data in Fauna. # Databases and multi-tenancy In Fauna, a database stores data as [documents](../documents/) in one or more [collections](../collections/). ## [](#model)Database model Fauna’s database model makes it easy to create databases for isolated environments, such as staging and production, and multi-tenant applications: * Each Fauna database can have many child databases. You can use child databases as tenants for your application. See [Multi-tenancy](#multi-tenancy). * All databases, including child databases, are instantly allocated without provisioning or warmup. * Each database is logically isolated from its peers with separate access controls. See [Isolation and access control](#isolation-access-control). * All Fauna resources, except [top-level keys](../../security/keys/#key-scope), exist as documents within a specific database. This includes [collections](../collections/), [user-defined functions](../../schema/user-defined-functions/), and [child databases](#child). * Queries run in the context of a single database and can’t access data outside the database. See [Scope and routing](#scope-routing). ## [](#multi-tenancy)Multi-tenancy Fauna databases support a hierarchical database structure with top-level and child databases. ### [](#child)Top-level and child databases Top-level databases exist in an account’s top-level context. All Fauna resources, except [top-level keys](../../security/keys/#key-scope), exist as documents within a specific database. This includes child databases. A database can have many child databases, which can also have child databases. ## [](#isolation-access-control)Isolation and access control You can use Fauna databases to build applications with strong isolation guarantees: * Each database is logically isolated from its peers with separate access controls. * You can use [scoped keys](#scoped-keys) from a parent database to manage and access data in child databases. * Child databases can’t access or discover parent or peer databases. ## [](#scope-routing)Scope and routing Each Fauna query is an independently authenticated request to the [Query HTTP API endpoint](../../../reference/http/reference/core-api/#operation/query). Each query is a transaction. Queries run in the context of a single database and can’t access data outside the database. Queries are routed to a database based on the Query API request’s [authentication secret](../../security/authentication/#secrets). You can’t use a secret to access a peer or parent database. ### [](#scoped-keys)Query child databases using scoped keys A [scoped key](../../security/keys/#scoped-keys) lets you use a parent database’s admin key to send query requests to its child databases. For example, if you have an admin key for a parent database and want to connect to a child database named `childDB`, you can create a scoped key using the following format: ``` // Scoped key that impersonates an `admin` key for // the `childDB` child database. fn...:childDB:admin ``` | See Scoped keys | | --- | --- | --- | ## [](#sys-coll)`Database` collection Fauna stores metadata and settings for a database’s child databases as documents in the [`Database`](../../../reference/fql-api/database/) system collection. 
These documents have the [DatabaseDef](../../../reference/fql/types/#databasedef) type. You can use [`Database` collection methods](../../../reference/fql-api/database/) to create and manage databases in FQL. The `Database` collection only contains direct child databases of the database scoped to your authentication secret. You can’t use the `Database` collection to access parent, peer, or other descendant databases. | See Database FQL docs | | --- | --- | --- | ### [](#global-id)Global database ID Each `Database` document contains an auto-generated, globally unique ID for the database: ``` { name: "ECommerce", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), // Globally unique id for the `ECommerce` database. global_id: "ysjpykbahyyr1", priority: 10, typechecked: true } ``` Applications and external systems can also use this ID to identify a Fauna database. For example, Fauna [access providers](../../security/access-providers/) use a database’s globally unique ID in the audience URL, which must be included in the `aud` (audience) claim of JWTs issued by the provider. Fauna uses this URL to authenticate query requests and route them to a specific database. ## [](#create-manage)Create and manage databases You can create and manage databases using: * The [Fauna Dashboard](https://dashboard.fauna.com/) * The [Fauna CLI](../../../build/cli/v4/) * [FQL `Database` methods](../../../reference/fql-api/database/) ### [](#create)Create a database You can create a database using the [Fauna Dashboard](https://dashboard.fauna.com/) or the [Fauna CLI](../../../build/cli/v4/)'s [`fauna database create`](../../../build/cli/v4/commands/database/create/) command: ```cli # Create a top-level 'my_db' database # in the 'us' region group. fauna database create \ --name my_db \ --database us # Create a child database named 'child_db' # directly under 'us/parent_db'. fauna database create \ --name child_db \ --database us/parent_db ``` You can use [`Database.create()`](../../../reference/fql-api/database/static-create/) to programmatically create child databases database using an FQL query: ```fql Database.create({ name: "ECommerce", typechecked: true }) ``` Using FQL to create or manage top-level databases is not supported. ### [](#manage-a-databases-schema)Manage a database’s schema You can use [schema](../../schema/) to control a database’s structure and behavior. You manage schema using the [Fauna Dashboard](https://dashboard.fauna.com/) or as `.fsl` files using the [Fauna CLI](../../../build/cli/v4/). For example, the following [`fauna schema push`](../../../build/cli/v4/commands/schema/push/) command [stages a schema change](../../schema/manage-schema/#staged) using the CLI: ```cli fauna schema push \ --database us/my_db ``` You can also [create a CI/CD pipeline](../../schema/manage-schema/#cicd) to copy and deploy schema across databases. | See Schema | | --- | --- | --- | ### [](#manage-schema-for-child-databases)Manage schema for child databases You can manage schema for child databases using: * The [Fauna CLI](#cli) * [FQL schema methods](#fql) * The [Schema HTTP API](#http) #### [](#cli)Use the Fauna CLI The Fauna CLI’s [`fauna schema`](../../../build/cli/v4/commands/schema/) commands let you specify a `--database` when you use [interactive login](../../../build/cli/v4/#interactive) or an [account key](../../../build/cli/v4/#account-key) for authentication. You can use `--database` to interact with any child database the related account key has access to. 
```cli # Stage schema changes for the # 'us/parent_db/child_db' database. fauna schema push \ --database us/parent_db/child_db \ --dir /path/to/schema/dir ``` Alternatively, you can use `--secret` to provide a [scoped key](../../security/keys/#scoped-keys). A scoped key lets you interact with a child database’s schema using a parent database’s admin key. For example, with a parent database’s admin key secret, you can access a child database by appending the child database name and role: ```cli # Use a scoped key from a parent database # to stage schema in the 'child_db' child database. # The scoped key has `admin` privileges. fauna schema push \ --secret fn123:child_db:admin \ --dir /path/to/schema/dir ``` #### [](#fql)Use FQL schema methods Fauna stores each schema for a database as an FQL document in a related [system collection](../collections/#system-coll). You can use methods for these system collections to programmatically manage the schema of child databases using FQL queries. Use a [scoped key](../../security/keys/#scoped-keys) to manage a child database’s schema using queries in a parent database. | FSL schema | FQL system collection | | --- | --- | --- | --- | | Access provider schema | AccessProvider collection | | Collection schema | Collection collection | | Function schema | Function collection | | Role schema | Role collection | #### [](#http)Use Schema HTTP API endpoints You can use the Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) to perform programmatically perform schema changes, including [staged schema changes](../../schema/manage-schema/#staged). Use a [scoped key](../../security/keys/#scoped-keys) to manage a child database’s schema using a parent database’s key secret. For example, the following [Update schema files](../../../reference/http/reference/core-api/#operation/update) request uses a scoped key that impersonates a key with the `admin` role for the `childDb` child database. The request starts a staged schema change for the `childDb` child database: ```bash curl -X POST "https://db.fauna.com/schema/1/update?staged=true" \ -H "Authorization: Bearer $FAUNA_SECRET:childDb:admin" \ -H "Content-Type: multipart/form-data" \ -F "collections.fsl=@./schema/collections.fsl" \ -F "functions.fsl=@./schema/functions.fsl" ``` ### [](#rename)Rename a database You can rename a database using the [Fauna Dashboard](https://dashboard.fauna.com/). You can also use [`database.update()`](../../../reference/fql-api/database/update/) to rename a child database in an FQL query: ```fql // Renames the `ECommerce` database to `ECommerceStore`. Database.byName("ECommerce")!.update({ name: "ECommerceStore" }) ``` Renaming a database preserves any inbound references to the database. Data in a renamed database remains accessible using existing keys. ### [](#delete)Delete a database You can delete a database using the [Fauna Dashboard](https://dashboard.fauna.com/) or the [Fauna CLI](../../../build/cli/v4/)'s [`fauna database delete`](../../../build/cli/v4/commands/database/delete/) command: ```cli # Delete the top-level 'my_db' database # in the 'us' region group. fauna database delete \ --name my_db \ --database us # Delete a database named 'child_db' directly # under 'us/parent_db'. fauna database delete \ --name child_db \ --database us/parent_db ``` You can also use [`database.delete()`](../../../reference/fql-api/database/delete/) to delete a child database in an FQL query: ```fql // Deletes the `ECommerce` database. 
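// `Database.byName()` returns a NullDoc if no database has this name.
// The `!` operator asserts the result is non-null; use `?.delete()`
// instead to skip the call if the database might not exist.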
Database.byName("ECommerce")!.delete() ``` #### [](#delete-cons)Considerations When you delete a database, its data becomes inaccessible and is asynchronously deleted. As part of the deletion process, Fauna recursively deletes: * Any keys scoped to the database. * The database’s child databases, including any nested databases. Deleting a database with a large number of keys can exceed Transactional Write Ops throughput limits. This can cause [throttling errors](../../../reference/http/reference/errors/#rate-limits) with a `limit_exceeded` [error code](../../../reference/http/reference/errors/#error-codes) and a 429 HTTP status code. Deleting a database with a large number of child databases can cause timeout errors with a `time_out` [error code](../../../reference/http/reference/errors/#error-codes) and a 440 HTTP status code. To avoid throttling or timeouts, incrementally delete all keys and child databases before deleting the database. See [delete all keys](../../../reference/fql-api/key/delete/#delete-all-keys) and [delete all child databases](../../../reference/fql-api/database/delete/#delete-all-dbs). # Collections You add data to Fauna as JSON-like objects called [documents](../documents/). Documents are stored in collections, which group related data. ## [](#coll-types)Collection types Fauna has two types of collections: * [User-defined collections](#user-defined-coll) * [System collections](#system-coll) ### [](#user-defined-coll)User-defined collections A user-defined collection stores application data you’ve added to a Fauna database. For example, a database for an e-commerce application may have a `Product` collection to store product-related data. A database can have zero or more user-defined collections. A user-defined collection can have any number of documents. ### [](#system-coll)System collections A system collection stores built-in Fauna resources. For example, Fauna stores [credentials](../../security/tokens/#credentials) as documents in the `Credential` system collection. You can use [Credential](../../../reference/fql-api/credential/) methods to access `Credential` collection documents in FQL. System collections include: * [`AccessProvider`](../../../reference/fql-api/accessprovider/) * [`Collection`](../../../reference/fql-api/collection/) * [`Credential`](../../../reference/fql-api/credential/) * [`Database`](../../../reference/fql-api/database/) * [`Function`](../../../reference/fql-api/function/) * [`Key`](../../../reference/fql-api/key/) * [`Role`](../../../reference/fql-api/role/) * [`Token`](../../../reference/fql-api/token/) #### [](#named-coll)Named collections A named collection is a subset of system collections whose documents are uniquely identified using names instead of document IDs. Named collections include: * [`AccessProvider`](../../../reference/fql-api/accessprovider/) * [`Collection`](../../../reference/fql-api/collection/) * [`Database`](../../../reference/fql-api/database/) * [`Function`](../../../reference/fql-api/function/) * [`Role`](../../../reference/fql-api/role/) #### [](#system-collection-limitations)Limitations You can create and manage system collection documents, but you can’t create, change, or delete a system collection itself. You can’t change the [collection schema](#collection-schema) of a system collection. ## [](#collection-schema)Collection schema You create and manage user-defined collections as FSL collection schema. Each user-defined collection has a collection schema. 
The schema defines the structure and behavior of a collection and its documents. | See Collection schema | | --- | --- | --- | ## [](#collection-system-coll)`Collection` collection Fauna stores the schema for user-defined collections as documents in the `Collection` system collection. You can use the `Collection` collection’s [static methods](../../../reference/fql-api/collection/#static-methods) to access collection schema in FQL. | Reference: Static Collection methods | | --- | --- | --- | ## [](#create-access-coll-docs)Create and access collection documents Collections names act as top-level objects in FQL queries. You can use [collection name methods](../../../reference/fql-api/collection/#name-methods) to create and access collection documents. For example, the following query uses [`collection.create()`](../../../reference/fql-api/collection/instance-create/) to create a document in the `Customer` collection: ```fql // Creates a `Customer` collection document. Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) ``` The query returns the document and includes the document ID: ``` { id: "12345", coll: Customer, ts: Time("2099-07-10T15:41:49.945Z"), cart: null, orders: "hdW...", name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` You can use [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) to get a document by its ID: ```fql // Gets a `Customer` collection document by ID. Customer.byId("12345") ``` You can chain [document instance methods](../../../reference/fql-api/document/#instance-methods) to a document to update, replace, or delete the document. For example, the following query uses [`document.update()`](../../../reference/fql-api/document/update/) to update a document: ```fql // Updates the `Customer` collection document. Customer.byId("12345")?.update({ // Updates the existing `name` field value. name: "Jonathan Doe" }) ``` | Reference: Collection name methods, document instance methods | | --- | --- | --- | ## [](#security-and-privileges)Security and privileges A [user-defined role](../../security/roles/) can assign privileges to a collection, including system collections. Collection privileges grant access to a collection’s documents. A collection privilege can allow the `create`, `delete`, `read`, or `write` actions. `read` access includes the ability to [call the collection’s indexes](../indexes/#call). An example [FSL role schema](../../../reference/fsl/role/): ```fsl role customer { // Grant read access to `Product` documents and indexes. privileges Product { read } } ``` You can also grant access to system collections that store Fauna resources: ```fsl role manager { // Grant `create` and `read` access to the `Token` system collection. // Allows the role to create token secrets. privileges Token { create read } } ``` To allow a role to create, delete, or manage user-defined collections themselves, grant access to the `Collection` system collection: ```fsl role manager { // Grant full access to the `Collection` system collection. // Allows the role to create, delete, read, and update // user-defined collections. privileges Collection { create delete read write } } ``` Built-in roles also have collection privileges. See [built-in roles](../../security/roles/). 
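For example, a client authenticated with the `customer` role defined earlier can read `Product` documents and call the collection’s indexes. A minimal sketch, assuming the `Product` collection defines a `sortedByPriceLowToHigh()` index:

```fql
// Permitted by the `customer` role's `read` privilege on `Product`:
// calling a `Product` index and projecting document fields.
Product.sortedByPriceLowToHigh() {
  name,
  price
}
```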
# Documents | Reference: | Document | | --- | --- | --- | --- | You add data to Fauna as JSON-like objects called documents. A document is a single, changeable record in a Fauna database. Each document belongs to a [collection](../collections/). All user data is stored in documents, and every entity in the Fauna data model, including [Database](../../../reference/fql/types/#database), [Collection](../../../reference/fql/types/#collection), and [Function](../../../reference/fql/types/#function), is defined in a document. Storing data in documents instead of rows and columns gives you greater flexibility. It allows you to shape your data in the way that best fits your applications instead of writing your applications to fit the data. Every record in a Fauna database is grouped and stored as a [Document](../../../reference/fql/types/#document) object, consisting of key:value pairs. A key can be a document. Data stored in a document looks similar to a JSON document and can include Strings, Integers, Arrays, and other [FQL data types](../../../reference/fql/types/). Documents include: * a timestamp * the name of their collection * a string-encoded integer used as a document ID. Documents changes create a new version of the document, which supports temporal querying. See the [Global limits](../../../reference/requirements-limits/#glimits) for more information on document size and query limits. ## [](#document-type)Document type A document’s data type is taken from its collection’s name. For example, `Product` for a document in the `Product` collection. This type is an instance of the [Document](../../../reference/fql/types/#document) type, which is a subtype of the [Object](../../../reference/fql/types/#object) type. You define the structure of a collection’s document type using the [collection schema’s](../../../reference/fsl/collection/): * [Field definitions](../../schema/#field-definitions) * [Wildcard constraint](../../schema/#wildcard-constraint) * [Computed fields](../../schema/#computed-fields) | See Document type definitions | | --- | --- | --- | ## [](#meta)Document metadata All documents have these common metadata fields: * Documents have a string-encoded 64-bit integer identifier. A document ID is a compound value of a collection identifier and a unique document ID. The ID is a unique identifier for the document in the scope of the database where it is stored. * When a document is updated, a new version is stored. User documents have a timestamp that identifies the most recent document update. Documents are versioned, and the versions are distinguished using a timestamp. When a query doesn’t specify a timestamp, the latest version of the document is used. The timestamp is returned in the document `ts` field. * The `ts` field shouldn’t be directly manipulated. To track timestamps independent of Fauna operations, include fields that are under your control in your documents to record timestamps. * `data` is a [reserved field](../../../reference/fql/reserved/#reserved-schema) that contains all user-defined fields and their values. By default, the `data` field isn’t returned in query results. However, if [typechecking](../../query/#static-typing) is disabled, you can [project](../../../reference/fql/projection/) the field to return it. The `data` field does not contain [computed fields](../../../reference/fsl/computed/) or metadata fields, such as `id`, `coll`, `ts`, or `ttl`. 
You can use the `data` field to safely nest user-defined fields that have [reserved field names](../../../reference/fql/reserved/#reserved-schema), such as `id` or `ttl`, in a document. See [Data field](https://docs.faunadb.org/fauna/v4/migration/migrate-to-v10/#data-field) and [Avoid conflicts with reserved fields](https://docs.faunadb.org/fauna/v4/migration/migrate-to-v10/#avoid-conflicts) in the v10 migration docs. * Documents have an optional `ttl` (time-to-live) field that indicates when the document should be removed. See [Document time-to-live (TTL)](../../doc-ttl/). ## [](#crud-operations-on-documents)CRUD operations on documents Every document object has the following methods: | Method | Description | | --- | --- | --- | --- | | document.delete() | Deletes the document, returning the id and coll in an object. | | document.exists() | Tests if a given document exists. | | document.replace() | Fully replaces the document data with the provided data. Fields are removed if they aren’t present in the provided data. | | document.update() | Updates the document with the provided data and returns the updated document. This does a patch update. Omitted fields are left as-is. To remove fields from a document, set the field value to null. | ## [](#document-references)Document references You can use [document references](../relationships/) to model relationships between documents, including documents in other collections. ```fql // Get a `Category` collection document. let produce = Category.byName("produce").first() // Create a `Product` document that references // the `Category` document. Product.create({ name: "key lime", description: "Organic, 1 ct", price: 79, // The `category` field includes a reference to // the `Category` document as a field value. category: produce, stock: 2000 }) ``` For more information, see [Model relationships using document references](../relationships/) | See Model relationships using document references | | --- | --- | --- | ## [](#nulldoc)NullDocs A [NullDoc](../../../reference/fql/types/#nulldoc) is a marker used to indicate that a document doesn’t exist or is inaccessible. A NullDoc’s data type is taken from its collection’s name. For example, a NullDoc for a `Product` collection document is `NullProduct`. NullDocs coalesce as a `null` value. Testing a `NullDoc` against a value of `null` returns `true`. Several FQL methods, such as [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/), return a NullDoc for missing or inaccessible documents: ```fql // Attempts to access a `Product` collection document // with an `id` of `12345`. In this // example, the document doesn't exist. Product.byId("12345") ``` ``` // Returns a `NullProduct` value. Product("12345") /* not found */ ``` ### [](#dangling-refs)Dangling references Documents may contain [references](../relationships/) to [Nulldocs](../../../reference/fql/types/#nulldoc) — documents that don’t exist. These are called dangling references. For example: ```fql // Gets a `Product` collection document. // Use projection to return `name`, `description`, and `category` fields. Product.byId("111") { name, description, // The `category` field contains a reference to a `Category` collection document. category } ``` ``` { name: "cups", description: "Translucent 9 Oz, 100 ct", // If the referenced `Category` collection document doesn't exist, // the projection returns a NullDoc. 
category: Category("123") /* not found */ } ``` ### [](#check-for-a-documents-existence)Check for a document’s existence User-defined collection documents and system collection documents have an `exists()` method to test whether a referenced document exists. For example: ```fql // Checks if a `Product` collection // document with an ID of `111` exists. Product.byId("111").exists() // true // If the document doesn't exist, // `exists()` returns `false`. Product.byId("999").exists() // false ``` ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Product.byId("111").exists() // true Product.byId("111") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` # Sets | Reference: Set | | --- | --- | --- | A Set is an [FQL data type](../../../reference/fql/types/) that contains an iterable, unbounded group of values. You typically fetch [documents](../documents/) from a [collection](../collections/) as a Set. ## [](#coll)Get a Set of collection documents You can fetch a Set of documents from a collection by calling a [collection instance method](../../../reference/fql-api/collection/#instance-methods) that returns a Set. For example, you can call an [index](../indexes/) to get a filtered list of documents: ```fql // Uses the `Product` collection's `sortedByPriceLowToHigh()` // index to get `Product` documents with a // `price` greater than or equal to `199` ($1.99): Product.sortedByPriceLowToHigh({ from: 1_99 }) ``` ``` { // Returns matching `Product` documents as a Set: data: [ { id: "777", coll: Product, ts: Time("2099-08-20T18:18:47.100Z"), name: "limes", description: "Conventional, 16 oz bag", // $2.99 in cents price: 299, stock: 30, category: Category("789") }, ... ] } ``` ## [](#set-methods)Transform Sets with Set instance methods You can use [set instance methods](../../../reference/fql-api/set/#instance-methods) to transform a fetched Set. For example, you can: * Filter a Set using an [index](../indexes/) or [`set.where()`](../../../reference/fql-api/set/where/). * Sort a Set using an [index](../indexes/), [`set.order()`](../../../reference/fql-api/set/order/), or [`set.reverse()`](../../../reference/fql-api/set/reverse/). * Aggregate data from a Set using [`set.fold()`](../../../reference/fql-api/set/fold/) or [`set.reduce()`](../../../reference/fql-api/set/reduce/). ```fql // Uses the`Product` collection's `sortedByPriceLowToHigh()` // index and `where()` to get `Product` collection documents with: // - A `price` greater than or equal to `1_99` ($1.99) // - A `stock` greater than `50` Product.sortedByPriceLowToHigh({ from: 1_99 }).where(.stock > 50) ``` | Reference: Set instance methods | | --- | --- | --- | ## [](#set-operations)Set operations You can use [set instance methods](../../../reference/fql-api/set/#instance-methods) to perform SQL-like [set operations](https://en.wikipedia.org/wiki/Set_operations_\(SQL\)), such as unions, joins, and intersections, in FQL. For examples, see [Work with multiple Sets](../../query/patterns/sets/). 
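For instance, the following minimal sketch computes an intersection-style result by filtering one Set against membership in another. It assumes the demo `Product` and `Category` collections and the `byCategory()` and `sortedByPriceLowToHigh()` indexes used elsewhere on this page:

```fql
// Products that are both in the "produce" category and
// priced at or below 5_00 ($5.00): filter the category Set
// by membership in the price-range Set.
let produce = Category.byName("produce").first()

Product.byCategory(produce)
  .where(doc => Product.sortedByPriceLowToHigh({ to: 5_00 }).includes(doc))
```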
| See Work with multiple Sets | | --- | --- | --- | ## [](#pagination)Pagination Fauna automatically paginates result Sets with 16 or more elements. When a query returns paginated results, Fauna materializes a subset of the Set with an `after` pagination cursor: ```fql // Uses the `Product` collection's `sortedByPriceLowToHigh()` index to // return all `Product` documents. // The collection contains more than 16 documents. Product.sortedByPriceLowToHigh() ``` ``` { // The result Set contains 16 elements. data: [ { id: "555", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "single lime", description: "Conventional, 1 ct", price: 35, stock: 1000, category: Category("789") }, { id: "888", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "cilantro", description: "Organic, 1 bunch", price: 149, stock: 100, category: Category("789") }, ... ], // Use the `after` cursor to get the next page of results. after: "hdW..." } ``` To get the next page of results, pass the `after` cursor to [`Set.paginate()`](../../../reference/fql-api/set/static-paginate/). To change the default page size, use [`set.pageSize()`](../../../reference/fql-api/set/pagesize/). | See Pagination | | --- | --- | --- | ## [](#set-refs)Set references [Sets](./) are not [persistable](../../../reference/fql/types/#persistable). You can’t store a Set as a field value or create a [field definition](../../schema/#field-definitions) that accepts a Set. Instead, you can use a [computed field](../../../reference/fsl/computed/) to define a read-only function that dynamically fetches a Set: ```fsl collection Customer { ... // Computed field definition for the `orders` field. // `orders` contains a reference to a Set of `Order` collection documents. // The value is computed using the `Order` collection's // `byCustomer()` index to get the customer's orders. compute orders: Set = ( customer => Order.byCustomer(customer)) ... } ``` If the field isn’t [projected](../../../reference/fql/projection/), it contains an [`after` pagination cursor](../../query/pagination/#cursor) that references the Set: ```fql // Get a `Customer` document. Customer.byEmail("alice.appleseed@example.com").first() ``` ``` { id: "111", coll: Customer, ts: Time("2099-10-22T21:56:31.260Z"), cart: Order("412483941752112205"), // `orders` contains an `after` cursor that // references the Set of `Order` documents. orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` To materialize the Set, [project](../../../reference/fql/projection/) the computed field: ```fql let customer = Customer .where(.email == "alice.appleseed@example.com") .first() // Project the `name`, `email`, and `orders` fields. customer { name, email, orders } ``` ``` { name: "Alice Appleseed", email: "alice.appleseed@example.com", orders: { data: [ { id: "412483941752112205", coll: Order, ts: Time("2099-10-22T21:56:31.260Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2099-10-22T21:56:31.104083Z"), payment: {} }, ... ] } } ``` Alternatively, you can pass the `after` cursor to [`Set.paginate()`](../../../reference/fql-api/set/static-paginate/): ```fql Set.paginate("hdW...", 2) ``` ``` { // Returns a materialized Set of `Order` documents. 
  data: [
    {
      id: "412483941752112205",
      coll: Order,
      ts: Time("2099-10-22T21:56:31.260Z"),
      items: "hdW...",
      total: 5392,
      status: "cart",
      customer: Customer("111"),
      createdAt: Time("2099-10-22T21:56:31.104083Z"),
      payment: {}
    },
    ...
  ]
}
```

## [](#set-vs-array)Sets vs. Arrays

While both are iterable, Sets differ from FQL [arrays](../../../reference/fql/types/#array) as follows:

| Difference | Set | Array |
| --- | --- | --- |
| Purpose | Typically represents a dynamic and potentially large Set of documents from a collection. | Represents a fixed sequence of known values. Limited to 16,000 elements. |
| Order | Unordered. You order Sets using indexes and set instance methods. Elements don't have index numbers. | Ordered. Each element has a specific index number. |
| Pagination | Result Sets are paginated. See Pagination. | Arrays are not paginated. |
| Persistable | Not persistable. A Set is dynamic and can't be reliably stored, retrieved, and updated in a Fauna database. | An Array of other persistable values is persistable. |
| Supported methods | Supports set instance methods. | Supports array instance methods. |
| Loading strategy | Some Set instance methods lazily load the Set. Other Set instance methods eagerly load the Set. Lazy-loading methods only return a result Set when the query forces the Set to be materialized. | All Array instance methods eagerly load and materialize the entire Array. |

### [](#interop)Interoperability between Sets and Arrays

For interoperability, most [set instance methods](../../../reference/fql-api/set/#instance-methods) have an equivalent [array instance method](../../../reference/fql-api/array/#instance-methods) (and the reverse). You can use the [`set.toArray()`](../../../reference/fql-api/set/toarray/) and [`array.toSet()`](../../../reference/fql-api/array/toset/) methods to cast between the Set and Array data types.

# Indexes

Reference: Index definitions

An index stores, or covers, specific document field values for quick retrieval. You can use indexes to filter and sort a collection's documents in a performant way. Using indexes can significantly improve query performance and reduce costs, especially for large datasets. Unindexed queries should be avoided.

## [](#define-an-index)Define an index

You create and manage indexes at the collection level as part of an FSL [collection schema](../../schema/#collection-schema). An index definition can include:

* **Terms**: Document fields for exact match searches
* **Values**: Document fields for sorting and range searches

You can only index [persistable](../../../reference/fql/types/#persistable) field values. An index definition must include at least one term or value. A collection can include multiple index definitions:

```fsl
collection Product {
  ...
  // Defines the `byName()` index.
  index byName {
    // `terms` are document fields for exact match searches.
    // In this example, you get `Product` collection documents
    // by their `name` field value.
    terms [.name]

    // `values` are document fields for sorting and range searches.
    // In this example, you sort or filter index results by their
    // descending `stock` field value.
    values [desc(.stock)]
  }
  ...
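  // A second index on the same collection (sketch): covers
  // `price`, `name`, and `description` as values for sorting
  // and range searches, as used in the sections below.
  index sortedByPriceLowToHigh {
    values [.price, .name, .description]
  }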
}
```

You can create and manage schema using any of the following:

* The [Fauna CLI](../../schema/manage-schema/#staged)
* The [Fauna Dashboard](https://dashboard.fauna.com/)
* The Fauna Core HTTP API's [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema)
* [FQL schema methods](../../schema/manage-schema/#fql)

### [](#fql-index-definitions)FQL index definitions

Fauna stores schema for user-defined collections as documents in the [`Collection`](../../../reference/fql-api/collection/) system collection. `Collection` documents include an FQL version of the collection's index definitions:

```
{
  name: "Customer",
  coll: Collection,
  ts: Time("2099-10-03T20:45:53.780Z"),
  ...
  indexes: {
    byEmail: {
      terms: [
        { field: ".email", mva: false }
      ],
      values: [
        { field: ".email", order: "desc", mva: false },
        { field: ".name", order: "asc", mva: false }
      ],
      queryable: true,
      status: "complete"
    },
  },
  ...
}
```

You can use [Collection](../../../reference/fql-api/collection/) methods to access and manage index definitions in FQL.

### [](#builds)Index builds

When you submit a new or updated collection schema, Fauna may need to build (or rebuild) the collection's indexes. Fauna builds an index if:

* The index definition is new.
* The index definition is updated, including:
  * Adding a new index term or index value
  * Changing the order of existing index terms or index values
* The definition, or function body, of a [computed field](../../../reference/fsl/computed/) covered by the index changes. Changing the name of a computed field does not trigger a rebuild.
* A field that has a [field definition](../../schema/#field-definitions) and is covered by the index is [migrated](../../schema/#schema-migrations) in a way that affects the field's values. Other migrations, such as renaming a field, don't require a rebuild.

If the collection contains more than 128 documents, Fauna uses a background task to build the index. During a build, the index may not be queryable.

If the collection contains 128 or fewer documents, Fauna builds the index in the same transaction as the collection update. The index is immediately queryable.

#### [](#monitor-index-builds)Monitor index builds

The FQL version of the collection schema, stored as a [`Collection` document](../../../reference/fql-api/collection/), includes `status` and `queryable` properties for `indexes` objects. You can use these properties to monitor the availability of an index during a build:

```fql
// Gets the FQL definition of the `Customer` collection schema.
// Projects the `indexes` object from the schema.
Collection.byName("Customer") {
  indexes
}
```

```
{
  indexes: {
    // FQL version of the `byEmail` index definition.
    byEmail: {
      terms: [
        { field: ".email", mva: false }
      ],
      values: [
        { field: ".email", order: "desc", mva: false },
        { field: ".name", order: "asc", mva: false }
      ],
      queryable: true, // Indicates the index is queryable.
      status: "complete" // Indicates the index build is complete.
    },
    ...
  }
}
```

## [](#call)Call an index

In an FQL query, you call an index as a method on its collection:

```fql
// Call the `byName()` index to get `Product` collection
// documents with a `name` value of `limes`. Values must
// match exactly.
Product.byName("limes")
```

The call returns a Set of matching collection documents.

Reference: FQL index method docs

## [](#terms)Terms

You can use index terms to run exact match searches on document field values.
### [](#exact-match)Use index terms for exact match search The following index definition includes `name` as an index term: ```fsl collection Product { ... index byName { terms [.name] } ... } ``` When you call the index, you must pass an argument for each term in the index definition. ```fql // Get products named "limes" Product.byName("limes") ``` The call returns a Set of `Product` collection documents with a `name` of `limes`. ### [](#pass-multiple-index-terms)Pass multiple index terms The following index definition includes two index terms: ```fsl collection Customer { ... index byName { terms [.firstName, .lastName] } } ``` In an index call, use a comma to separate term arguments. Provide arguments in the same field order used in the index definition. ```fql // Get customers named "Alice Appleseed" Customer.byName("Alice", "Appleseed") ``` The call returns a Set of matching collection documents. ### [](#best-practices-for-index-terms)Best practices for index terms Avoid using frequently updated fields as index terms. Internally, Fauna [partitions](#partitions) indexes based on its terms, if present. Frequent updates to term field values trigger updates to these partitions. If you need to filter or run an exact match search on a frequently updated field, consider adding the field as an index value instead: ```fsl collection Product { ... // Defines the `sortedByName()` index. // The index includes the `name` field as an index value. // `name` is a frequently updated field. index sortedByName { values [.name, .description, .price] } } ``` Then use the index to run a [range search](#range-search) on the index value: ```fql // Uses the `sortedByName()` index to run a range search // on `name` field values. The query only retrieves `Product` // collection documents with a `name` of `limes`. The query // is covered and avoids document reads. Product.sortedByName({ from: "limes", to: "limes" }) { name, description, price } ``` ## [](#values)Values You can use index values to sort a collection’s documents. You can also use index values for range searches. ### [](#sort-documents)Sort collection documents The following index definition includes several index values: ```fsl collection Product { ... index sortedByPriceLowToHigh { values [.price, .name, .description] } } ``` Call the `sortedByPriceLowToHigh()` index with no arguments to return `Product` documents sorted by: * Ascending `price`, then …​ * Ascending `name`, then …​ * Ascending `description`, then …​ * Ascending `id` (default) ```fql // Get products by ascending price, name, and description Product.sortedByPriceLowToHigh() ``` #### [](#descending-order)Descending order By default, index values sort results in ascending order. To use descending order, use `desc()` in the index definition: ```fsl collection Product { ... index sortedByPriceHighToLow { values [desc(.price), .name, .description] } ... } ``` Call the index with no arguments to return `Product` documents sorted by: * Descending `price`, then …​ * Ascending `name`, then …​ * Ascending `description`, then …​ * Ascending `id` (default) ```fql // Get products by descending price, // ascending name, and ascending description Product.sortedByPriceHighToLow() ``` ### [](#range-search)Run a range search You can also use index values for range searches. The following index definition includes several index values: ```fsl collection Product { ... index sortedByPriceLowToHigh { values [.price, .name, .description] } } ``` The index specifies `price` as its first value. 
The following query passes an argument to run a range search on `price`: ```fql // Get products with a price between // 20_00 (inclusive) and 30_00 (inclusive) Product.sortedByPriceLowToHigh({ from: 20_00, to: 30_00 }) ``` If an index value uses descending order, pass the higher value in `from`: ```fql // Get products with a price between // 20_00 (inclusive) and 30_00 (inclusive) in desc order Product.sortedByPriceHighToLow({ from: 30_00, to: 20_00 }) ``` Omit `from` or `to` to run unbounded range searches: ```fql // Get products with a price greater than or equal to 20_00 Product.sortedByPriceLowToHigh({ from: 20_00 }) // Get products with a price less than or equal to 30_00 Product.sortedByPriceLowToHigh({ to: 30_00 }) ``` ### [](#pass-multiple-index-values)Pass multiple index values Use an Array to pass multiple value arguments. Pass the arguments in the same field order used in the index definition. ```fql Product.sortedByPriceLowToHigh({ from: [ 20_00, "l" ], to: [ 30_00, "z" ] }) ``` The index returns any document that matches the first value in the `from` and `to` Arrays. If matching documents have the same values, they are compared against the next Array element value, and so on. For example, the `Product` collection’s `sortedByPriceLowToHigh()` index covers the `price` and `name` fields as index values. The `Product` collection contains two documents: | Document | price | name | | --- | --- | --- | --- | --- | | Doc1 | 4_99 | pizza | | Doc2 | 6_98 | cups | The following query returns both Doc1 and Doc2, in addition to other matching documents: ```fql Product.sortedByPriceLowToHigh({ from: [4_99, "p"] }) ``` The first value (`4_99` and `6_98`) of each document matches the first value (`4_99`) of the `from` Array. Later, you update the document values to: | Document | price | name | | --- | --- | --- | --- | --- | | Doc1 | 4_99 | pizza | | Doc2 | 4_99 | cups | The following query no longer returns Doc2: ```fql Product.sortedByPriceLowToHigh({ from: [4_99, "p"] }) ``` Although the first value (`4_99`) in both documents matches the first value in the `from` Array, the second value (`cups`) in Doc2 doesn’t match the second value (`p`) of the `from` Array. ### [](#run-a-range-search-on-id)Run a range search on `id` All indexes implicitly include an ascending document `id` as the index’s last value. If you intend to run range searches on `id`, we recommend you explicitly include an ascending `id` as the last index value in the index definition, even if you have an otherwise identical index. For example, the following `sortByStock()` and `sortByStockandId()` indexes have the same values: ```fsl collection Product { ... index sortByStock { values [.stock] } index sortByStockandId { values [.stock, .id] } ... } ``` Although it’s not explicitly listed, `sortByStock()` implicitly includes an ascending `id` as its last value. To reduce your costs, Fauna only builds the `sortByStock()` index. When a query calls the `sortByStockandId()` index, Fauna uses the `sortByStock()` index behind the scenes. `sortByStockandId()` only acts as a [virtual index](#virtual-indexes) and isn’t materialized. ## [](#pass-terms-and-values)Pass terms and values If an index has both terms and values, you can run an exact match search on documents in a provided range. The following index definition includes `name` as an index term and `stock` as an index value: ```fsl collection Product { ... index byName { terms [.name] values [.stock] } ... 
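  // Another sketch that combines a term with multiple values:
  // exact match on `category`, then sort or range-search the
  // results by `price` and `stock`. Assumes the `category`
  // field used elsewhere on this page.
  index byCategorySortedByPrice {
    terms [.category]
    values [.price, .stock]
  }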
}
```

When you call the index, you must provide a term and can specify an optional range:

```fql
// Get products named "donkey pinata"
// with a stock between 10 (inclusive) and 50 (inclusive)
Product.byName("donkey pinata", { from: 10, to: 50 })
```

## [](#mva)Index an Array field

By default, Fauna assumes index term and value fields contain scalar values. Use `mva()` to index an [Array](../../../reference/fql/types/#array) field's values:

```fsl
collection Product {
  ...
  index byCategory {
    // `categories` is an Array field.
    terms [mva(.categories)]
  }

  index sortedByCategory {
    // `categories` is an Array field.
    values [mva(.categories)]
    // You can combine `mva()` with
    // `desc()` and `asc()`. Ex:
    // values [desc(mva(.categories))]
  }
  ...
}
```

`mva()` only works on the last item in the provided field accessor. For more complex nested Arrays, such as an object Array, use a [computed field](../../../reference/fsl/computed/):

```fsl
collection Order {
  ...
  // `Order` collection documents
  // have the following structure:
  // {
  //   customer: Customer(""),
  //   items: [
  //     {
  //       product: Product("PRODUCT_DOC_ID"),
  //       quantity: 10
  //     },
  //     ...
  //   ],
  //   ...
  // }

  // Defines the `quantities` computed field.
  // The field uses `map()` to extract `quantity` values
  // from the `items` Array's objects into a flat Array.
  compute quantities = (.items.map(item => item.quantity))

  // Uses `mva()` to index the computed `quantities` field.
  index byQuantities {
    terms [mva(.quantities)]
  }
  ...
}
```

## [](#covered-queries)Covered queries

If you [project](../../../reference/fql/projection/) or [map](../../../reference/fql-api/set/map/#project) an index's covered term or value fields, Fauna gets the field values from the index. The following index definition includes several index values:

```fsl
collection Product {
  ...
  index sortedByPriceLowToHigh {
    values [.price, .name, .description]
  }
}
```

The following is a covered query:

```fql
// This is a covered query.
// `name`, `description`, and `price` are values
// in the `sortedByPriceLowToHigh()` index definition.
Product.sortedByPriceLowToHigh() {
  name,
  description,
  price
}
```

If the projection contains an uncovered field, Fauna must retrieve the field values from the documents. This is an uncovered query:

```fql
// This is an uncovered query.
// `stock` is not one of the terms or values
// in the `sortedByPriceLowToHigh()` index definition.
Product.sortedByPriceLowToHigh() {
  name,
  stock
}
```

Performance hint: `non_covered_document_read`

Uncovered queries emit a [performance hint](../../../reference/http/reference/query-summary/#perf), if enabled. For example:

```
performance_hint: non_covered_document_read
- .stock is not covered by the Product.sortedByPriceLowToHigh index. See https://docs.faunadb.org/performance_hint/non_covered_document_read.
at *query*:1:42
  |
1 | Product.sortedByPriceLowToHigh() { name, stock }
  |                                          ^^^^^
  |
```

Covered queries are typically faster and less expensive than uncovered queries, which require document reads. If you frequently run uncovered queries, consider adding the uncovered fields to the index definition's [`values`](#sort-documents). For example:

```fsl
collection Product {
  ...
  // Adds the `stock` field as an index value.
  index sortedByPriceLowToHigh {
    values [.price, .name, .description, .stock]
  }
}
```

### [](#no-proj)No projection or mapping

Index queries without a [projection](../../../reference/fql/projection/) or [mapping](../../../reference/fql-api/set/map/#project) are uncovered. Fauna must read each document returned in the Set.
For example:

```fql
// This is an uncovered query.
// Queries without a projection or mapping
// require a document read.
Product.byName("limes")
```

Performance hint: `non_covered_document_read`

If [performance hints](../../../reference/http/reference/query-summary/#perf) are enabled, index queries without a projection or mapping emit a performance hint. For example:

```
performance_hint: non_covered_document_read
- Full documents returned from Product.byName. See https://docs.faunadb.org/performance_hint/non_covered_document_read.
at *query*:1:15
  |
1 | Product.byName("limes")
  |               ^^^^^^^^^
  |
```

If you frequently run such queries, consider adding the uncovered fields to the index definition's [`values`](#sort-documents). For example:

```fsl
collection Product {
  ...
  index byName {
    terms [.name]
    values [.price, .stock, .description]
  }
  ...
}
```

Then use projection or mapping to only return the fields you need. Given the previous index definition, the following query is covered:

```fql
// This is a covered query.
// `price`, `stock`, and `description` are values
// in the `byName()` index definition.
Product.byName("limes") {
  price,
  stock,
  description
}
```

### [](#filter)Filter covered values

You can use [`set.where()`](../../../reference/fql-api/set/where/) to filter the results of an [index call](#call). If the [`set.where()`](../../../reference/fql-api/set/where/) predicate only accesses fields defined in the index definition's `terms` and `values`, the query is [covered](#covered-queries). For example, given the following index definition:

```fsl
collection Product {
  ...
  index byName {
    terms [.name]
    values [.price, .description]
  }
  ...
}
```

The following query is covered:

```fql
// Covered query.
// Calls the `byName()` index.
// Uses `where()` to filter the results of
// the index call. The predicates only
// access covered terms and values.
Product.byName("limes")
  .where(.description.includes("Conventional"))
  .where(.price < 500) {
    name,
    description,
    price
  }
```

The following query is uncovered:

```fql
Product.byName("limes")
  .where(.description.includes("Conventional"))
  // The `where()` predicate accesses the uncovered
  // `stock` field.
  .where(.stock < 100)
  .where(.price < 500) {
    name,
    description,
    price
  }
```

To cover the query, add the uncovered field to the index definition's `values`:

```fsl
collection Product {
  ...
  index byName {
    terms [.name]
    // Adds `stock` to the index's values
    values [.price, .description, .stock]
  }
  ...
}
```

### [](#dynamic-filtering-using-advanced-query-composition)Dynamic filtering using advanced query composition

Complex applications may need to handle arbitrary combinations of search criteria. In these cases, you can use [query composition](../../query/composition/) to dynamically apply [indexes](./) and [filters](../../query/patterns/sets/#filters) to queries.

The following template uses query composition to:

* Automatically select the most selective index
* Apply remaining criteria as filters in priority order
* Support both index-based and filter-based search patterns

The template uses TypeScript and the [JavaScript driver](../../../build/drivers/js-client/). A similar approach can be used with any [Fauna client driver](../../../build/drivers/).

```typescript
// The template uses the `fql` template function and `Query` type
// from the Fauna JavaScript driver.
import { fql, Query } from "fauna"

/**
 * A JavaScript object with a sorted list of indexes or filters.
 *
 * JavaScript maintains key order for objects.
 * Sort items in the map from most to least selective.
 */
type QueryMap = Record<string, (...args: any[]) => Query>

/** Object to represent a search argument.
* * Contains the name of the index to use and the arguments * to pass to it. * * Example: * { name: "by_name", args: ["limes"] } * { name: "range_price", args: [{ from: 100, to: 500 }] } */ type SearchTerm = { name: string args: any[] } /** * Composes a query by prioritizing the most selective index and then * applying filters. * * @param default_query - The initial query to which indexes and filters are applied. * @param index_map - A map of index names to functions that generate query components. * @param filter_map - A map of filter names to functions that generate query components. * @param search_terms - An array of search terms that specify the type and arguments * for composing the query. * @returns The composed query after applying all relevant indices and filters. */ const build_search = ( default_query: Query, index_map: QueryMap, filter_map: QueryMap, search_terms: SearchTerm[] ): Query => { const _search_terms = [...search_terms] // Initialize a default query. Used if no other indexes are applicable. let query: Query = default_query // Iterate through the index map, from most to least selective. build_index_query: for (const index_name of Object.keys( index_map )) { // Iterate through each search term to check if it matches the highest priority index. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the // list and break out of the loop. if (index_name === search_term.name) { query = index_map[search_term.name](...search_term.args) _search_terms.splice(_search_terms.indexOf(search_term), 1) break build_index_query } } } // Iterate through the filter map, from most to least selective. for (const filter_name of Object.keys(filter_map)) { // Iterate through each search term to check if it matches the highest priority filter. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the list. if (filter_name === search_term.name) { const filter = filter_map[search_term.name](...search_term.args) query = fql`${query}${filter}` _search_terms.splice(_search_terms.indexOf(search_term), 1) } } } // If there are remaining search terms, you can't build the full query. if (_search_terms.length > 0) { throw new Error("Unable to build query") } return query } ``` The following example implements the template using the [Fauna Dashboard](https://dashboard.fauna.com/)'s demo data: ```typescript // Implementation of `index_map` from the template. // Sort items in the map from most to least selective. const product_index_priority_map: QueryMap = { by_order: (id: string) => fql`Order.byId(${id})!.items.map(.product!)`, by_name: (name: string) => fql`Product.byName(${name})`, by_category: (category: string) => fql`Product.byCategory(Category.byName(${category}).first()!)`, range_price: (range: { from?: number; to?: number }) => fql`Product.sortedByPriceLowToHigh(${range})`, } // Implementation of `filter_map` from the template. // Sort items in the map from most to least selective. const product_filter_map: QueryMap = { by_name: (name: string) => fql`.where(.name == ${name})`, by_category: (category: string) => fql`.where(.category == Category.byName(${category}).first()!)`, range_price: ({ from, to }: { from?: number; to?: number }) => { // Dynamically filter products by price range. 
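    // Each branch below returns an FQL fragment that build_search()
    // appends to the base query with fql`${query}${filter}`.
    // When neither bound is provided, the empty fql`` fragment
    // leaves the composed query unchanged.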
    if (from && to) {
      return fql`.where(.price >= ${from} && .price <= ${to})`
    } else if (from) {
      return fql`.where(.price >= ${from})`
    } else if (to) {
      return fql`.where(.price <= ${to})`
    }
    return fql``
  },
}

// Hybrid implementation of `index_map` and `filter_map` from the template.
// Combines filters and indexes to compose FQL query fragments.
// Sort items in the map from most to least selective.
const product_filter_with_indexes_map: QueryMap = {
  by_name: (name: string) =>
    fql`.where(doc => Product.byName(${name}).includes(doc))`,
  by_category: (category: string) =>
    fql`.where(doc => Product.byCategory(Category.byName(${category}).first()!).includes(doc))`,
  range_price: (range: { from?: number; to?: number }) =>
    fql`.where(doc => Product.sortedByPriceLowToHigh(${range}).includes(doc))`,
}

const order_id = (await client.query(fql`Order.all().first()!`))
  .data.id

// Search term names must match keys in the maps above,
// such as `by_name` and `range_price`.
const query = build_search(
  fql`Product.all()`,
  product_index_priority_map,
  product_filter_with_indexes_map,
  [
    // { name: "by_name", args: ["limes"] },
    // { name: "by_category", args: ["produce"] },
    { name: "range_price", args: [{ to: 1000 }] },
    { name: "by_order", args: [order_id] },
  ]
)

const res = await client.query(query)
```

### [](#null-values-are-uncovered)Null values are uncovered

Missing or `null` field values are not [stored or covered by an index](#covered-queries), even if the field is listed as one of the `values` in the index's definition. [Projecting](../../../reference/fql/projection/) or [mapping](../../../reference/fql-api/set/map/#project) a field with a `null` value requires a document read.

For example, the following `byName()` index definition includes the `description` field as an index value:

```fsl
collection Product {
  ...
  index byName {
    terms [.name]
    values [.price, .description]
  }
}
```

The following query creates a document that omits the `description` field, which is equivalent to a `null` value for the field:

```fql
Product.create({
  name: "limes",
  price: 2_99
  // The `description` field is omitted (effectively `null`).
})
```

If you use `byName()` to retrieve the indexed `name`, `price`, and `description` field values, the query is uncovered. A document read is required to retrieve the `null` value of the `description` field.

```fql
Product.byName("limes") {
  name,
  price,
  // Projects the `description` field.
  description
}
```

```
{
  data: [
    {
      name: "limes",
      price: 299,
      // Retrieving the `description` field's `null` value
      // requires a document read.
      description: null
    }
  ]
}
```

## [](#index-document-relationships)Index document relationships

When you [index](./) a field that contains a document, you index a document reference. The reference consists of the document's collection and document ID:

```fsl
collection Product {
  ...
  // The `category` field contains a reference to
  // a `Category` collection document.
  category: Ref
  ...
  // Indexes the `category` field as an index term.
  // The index stores `Category` document references.
  // Example reference: Category("123")
  index byCategory {
    terms [.category]
  }
}
```

An index can't store a referenced document's fields. An index also can't store a computed field that references another document. See [Patterns to avoid](../relationships/#patterns-to-avoid).

## [](#missing-or-null-values)Missing or null values

* **Terms:** If an index definition contains terms, Fauna doesn't index a document if all its index terms are missing or otherwise evaluate to null. This applies even if the document contains index values.
* **Values:** If an index definition contains only values, Fauna indexes all documents in the collection, regardless of whether the document’s index values are missing or otherwise null. ## [](#partitions)Partitions When an index has one or more terms, the index is internally partitioned by its terms. Partitioning lets Fauna scale indexes efficiently. ## [](#virtual-indexes)Virtual indexes To reduce your costs, Fauna doesn’t build duplicate indexes that have the same terms and values. Instead, Fauna only builds a single index and internally points any duplicates to the single index. For example, in the following collection, the `byDescription()` and `byDesc()` indexes are duplicates: ```fsl collection Product { ... index byDescription { terms [.description] } index byDesc { terms [.description] } } ``` When a query calls the `byDesc()` index, Fauna uses the existing `byDescription()` index internally. `byDesc()` is considered a virtual index and is never materialized. ## [](#history)Document history To support [temporal queries](../../doc-history/#temporal-query), indexes cover field values from both current documents and their [historical document snapshots](../../doc-history/). To enable quicker [sorting](#sort-documents) and [range searches](#range-search), current and historical index entries are stored together, sorted by index `values`. All indexes implicitly include an ascending [document `id`](../documents/#meta) as the index’s last value. When you read data from an index, including the [`collection.all()`](../../../reference/fql-api/collection/instance-all/) index, Fauna must read from both current and historical index entries to determine if they apply to the query. Fauna then filters out any data not returned by the query. You are charged for any Transactional Read Operations (TROs) used to read current or historical index data, including data not returned by the query. You are not charged for any historical data older than the retention period set by the [`history_days` setting](../../doc-history/#history-retention). # Model relationships using document references This guide covers how to model relational data in Fauna using document references. It covers: * Relational data in Fauna and how it’s structured * How to define, create, and resolve document references * How to model complex relationships using document references * Patterns to use and avoid when working with document references ## [](#relational-data)Relational data Relational data represents connections between different pieces of data in your database. In Fauna, you can model these relationships in two ways: * Storing a reference to a related document, similar to a foreign key in a traditional relational database. * Embedding related data directly in a parent document. This guide focuses on document references. For information about embedding, see [Embedding vs. document references](../best-practices/#embed). Using document references enables complex data modeling without duplicating data. The approach combines the ease of use of document databases with the data modeling capabilities of a traditional relational database. ## [](#create)Create a document relationship You can use document references to create relationships between documents. You can then use projection to dynamically resolve document references on read. 
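As a minimal sketch of the end-to-end flow covered in the following subsections, the query below stores a reference and resolves it in the same request. It assumes the demo `Product` and `Category` collections used throughout this guide:

```fql
// Store a reference to a `Category` document, then
// resolve it on read using projection.
let produce = Category.byName("produce").first()

Product.create({
  name: "key lime",
  description: "Organic, 1 ct",
  price: 79,
  // `category` stores a reference to the `Category` document.
  category: produce,
  stock: 2000
}) {
  name,
  // Projecting `category` resolves the reference.
  category {
    name
  }
}
```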
### [](#define)Define a document relationship You can define and enforce document relationships using typed [field definitions](../../schema/#field-definitions) in a [collection schema](../../schema/): ```fsl collection Product { ... // The `category` field accepts a reference to // a `Category` collection document. category: Ref ... } ``` ### [](#instantiate)Instantiate the relationship To instantiate the relationship, include a document reference as a field value: ```fql // Get a `Category` collection document. let produce = Category.byName("produce").first() // Create a `Product` document that references // the `Category` document. Product.create({ name: "key lime", description: "Organic, 1 ct", price: 79, // The `category` field includes a reference to // the `Category` document as a field value. category: produce, stock: 2000 }) ``` Fauna stores the field value as a document reference. The reference acts as a pointer to the document. The reference contains the document’s collection and document ID. If the field is not [projected](../../../reference/fql/projection/), the reference is returned on read: ``` // An example `Product` collection document. { id: "412568482109981184", coll: Product, ts: Time("2099-10-23T20:20:15.150Z"), name: "key lime", description: "Organic, 1 ct", price: 79, // A `Category` document reference. category: Category("789"), stock: 2000 } ``` ### [](#resolve)Resolve a document reference [Project](../../../reference/fql/projection/) the field to automatically resolve a document reference on read. Resolving a reference materializes the referenced document in results. ```fql // Get a `Product` document and project the // `name`, `description`, and `category` fields. Product.byName("key lime").first() { name, description, category { id, name, description } } ``` ``` { name: "key lime", description: "Organic, 1 ct", // The projection resolves the `Category` document // reference in the `category` field. category: { id: "789", name: "produce", description: "Fresh Produce" } } ``` You can use [projection](../../../reference/fql/projection/) to resolve multiple, deeply nested relationships in a single query: ```fql // Get a `Customer` document. let customer = Customer.byId("111") // Use the `Order` collection's `byCustomer()` index to get // `Order` documents based on their `customer` value. The // previous `Customer` document is passed to the index call. Order.byCustomer(customer) { // The `customer` field references the `Customer` document. customer { name, email }, // The `items` field references a Set of `OrderItem` documents. items { // Each `OrderItem` document references a nested `Product` document. product { name, description, price }, quantity }, total, status } ``` ``` { data: [ { // Resolves the `Customer` collection document in // the `customer` field. customer: { name: "Alice Appleseed", email: "alice.appleseed@example.com" }, // Resolves the Set of `OrderItem` collection documents in // the `items` field. items: { data: [ { // Resolves nested `Product` documents in // `OrderItem` documents. 
product: { name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698 }, quantity: 2 }, { product: { name: "donkey pinata", description: "Original Classic Donkey Pinata", price: 2499 }, quantity: 1 }, { product: { name: "pizza", description: "Frozen Cheese", price: 499 }, quantity: 3 } ] }, total: 5392, status: "cart" } ] } ``` ### [](#create-document-references-with-the-http-api)Create document references with the HTTP API When transmitting FQL data, the [Fauna Core HTTP API](../../../reference/http/reference/core-api/) encodes FQL data types as JSON using one of two data formats: * [Tagged format](../../../reference/http/reference/wire-protocol/#tagged): Tags JSON values with FQL type annotations, ensuring lossless typing. * [Simple format](../../../reference/http/reference/wire-protocol/#simple): Lossy format that converts FQL values to their closest JSON type, without annotations or transformations. The Core API’s [Query endpoint](../../../reference/http/reference/core-api/) uses the simple format by default. The simple format represents [document references](./) as lossy JSON objects. These objects can’t be used to create FQL document references directly: ```json // Document reference in // the simple format. "category": { "id": "111", "coll": "Category" } ``` For example, the following Query endpoint request sets the `category` field value to an object, not a document reference: ```bash # INCORRECT: # The following request does NOT create # a document reference. Instead, it sets # the `category` field value to an object. curl -X POST \ "https://db.fauna.com/query/1" \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H "Content-Type: application/json" \ -H "X-Format: simple" \ -d '{ "query": "Product.byName(\"limes\").first()?.update({ \"category\": { \"id\": \"111\", \"coll\": \"Category\" } })" }' ``` #### [](#use-fql-methods)Use FQL methods To create a document reference, use an FQL method, such as [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/), that returns a document or document reference and set the returned value as the field value: ```bash # CORRECT: # The following request uses `byId()` # to set the `category` field value to # a `Category` document reference. curl -X POST \ "https://db.fauna.com/query/1" \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H "Content-Type: application/json" \ -H "X-Format: simple" \ -d '{ "query": "Product.byName(\"limes\").first()?.update({ \"category\": Category.byId(\"111\") })" }' ``` If needed, you can interpolate any needed arguments, such as the document ID: ```bash # CORRECT: # The following request uses interpolated arguments. # It's equivalent to the previous query. curl -X POST \ "https://db.fauna.com/query/1" \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H "Content-Type: application/json" \ -H "X-Format: simple" \ -d '{ "query": "Product.byName(\"limes\").first()?.update({ \"category\": Category.byId(arg) })", "arguments": { "arg": "111" } }' ``` #### [](#use-the-tagged-format)Use the tagged format Alternatively, you can use the [tagged format](../../../reference/http/reference/wire-protocol/#tagged) to pass in the document reference as an interpolated argument in the query request: ```bash # DO: # The following request uses the tagged format # to pass the document reference into the FQL # query as an interpolated argument. 
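# In the tagged format, the `@ref` object encodes a document
# reference, and its `@mod` value names the document's collection,
# as shown in the `arguments` payload below.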
curl -X POST \ "https://db.fauna.com/query/1" \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H "Content-Type: application/json" \ -H "X-Format: tagged" \ -d '{ "query": "Product.byName(\"limes\").first()?.update({ \"category\": arg })", "arguments": { "arg": { "@ref": { "id": "111", "coll": { "@mod": "Category" } } } } }' ``` Fauna’s [client drivers](../../../build/drivers/) use the tagged format to preserve FQL typing when serializing to and deserializing from JSON. ## [](#complex-rel)Model a complex relationship You can use [field definitions](../../schema/#field-definitions) and [computed fields](../../../reference/fsl/computed/) to model complex document relationships. ### [](#array)Document Arrays A field definition can accept documents as part of an Array: ```fsl collection Order { ... // The `items` field accepts an Array of references to // `OrderItem` collection documents. items: Array> ... } ``` ### [](#nested-obj)Nested objects A field definition can accept documents as part of a nested object: ```fsl collection Customer { ... // Defines the `preferences` object field. preferences: { // Defines the nested `store` field. // The `store` field accepts a reference to // a `Store` collection document. store: Ref, emailList: Boolean, ... } ... } ``` ### [](#set-refs)Set references [Sets](../sets/) are not [persistable](../../../reference/fql/types/#persistable). You can’t store a Set as a field value or create a [field definition](../../schema/#field-definitions) that accepts a Set. Instead, you can use a [computed field](../../../reference/fsl/computed/) to define a read-only function that dynamically fetches a Set: ```fsl collection Customer { ... // Computed field definition for the `orders` field. // `orders` contains a reference to a Set of `Order` collection documents. // The value is computed using the `Order` collection's // `byCustomer()` index to get the customer's orders. compute orders: Set = ( customer => Order.byCustomer(customer)) ... } ``` If the field isn’t [projected](../../../reference/fql/projection/), it contains an [`after` pagination cursor](../../query/pagination/#cursor) that references the Set: ```fql // Get a `Customer` document. Customer.byEmail("alice.appleseed@example.com").first() ``` ``` { id: "111", coll: Customer, ts: Time("2099-10-22T21:56:31.260Z"), cart: Order("412483941752112205"), // `orders` contains an `after` cursor that // references the Set of `Order` documents. orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` To materialize the Set, [project](../../../reference/fql/projection/) the computed field: ```fql let customer = Customer .where(.email == "alice.appleseed@example.com") .first() // Project the `name`, `email`, and `orders` fields. customer { name, email, orders } ``` ``` { name: "Alice Appleseed", email: "alice.appleseed@example.com", orders: { data: [ { id: "412483941752112205", coll: Order, ts: Time("2099-10-22T21:56:31.260Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2099-10-22T21:56:31.104083Z"), payment: {} }, ... ] } } ``` Alternatively, you can pass the `after` cursor to [`Set.paginate()`](../../../reference/fql-api/set/static-paginate/): ```fql Set.paginate("hdW...", 2) ``` ``` { // Returns a materialized Set of `Order` documents. 
data: [ { id: "412483941752112205", coll: Order, ts: Time("2099-10-22T21:56:31.260Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2099-10-22T21:56:31.104083Z"), payment: {} }, ... ] } ``` ### [](#comp-field)Use computed fields to create query-based relationships You can use a [computed field](../../../reference/fsl/computed/) to create a relationship between documents based on a read-only query: ```fsl collection Customer { ... // Computed field definition for the `cart` field. // `cart` contains an `Order` collection document. // The value is computed using the `Order` collection's // `byCustomerAndStatus()` index to get the first order // for the customer with a `cart` status. compute cart: Order? = (customer => Order.byCustomerAndStatus(customer, 'cart').first()) ... } ``` ## [](#types)Relationship types You can use document references to model the following relationship types: | Relationship type | Definition | Example | | --- | --- | --- | --- | --- | | One-to-one | A document in one collection has only one associated document in another collection. | A book has one author. | | One-to-many | A document in a collection is associated with one or more documents in another collection. | An author writes many books. | | Many-to-many | A document in one collection is associated with multiple other documents in another collection. A document in the other collection is associated with multiple documents in the first collection. | A reader has many books, and a book has many readers. | ### [](#one-to-one)One-to-one relationship A one-to-one relationship exists when a document in one collection has only one associated document in another collection. ```fql // Products have a one-to-one relationships with // its category. Each product has one category. // Get a `Category` collection document let produce = Category.byName("produce").first() // Create a `Product` document // that references the `Category` document. Product.create({ name: "key lime", description: "Organic, 1 ct", price: 79, category: produce, stock: 2000 }) { name, description, category } ``` ### [](#one-to-many)One-to-many relationship A one-to-many relationship exists when a document in a collection is associated with one or more documents in another collection. ```fql // Categories have a one-to-many relationship with // products. A category can be assigned to multiple // products. // Get a `Category` collection document let produce = Category.byName("produce").first() // Define an Array that contains data for // multiple `Product` documents. Each `Product` // document references the previous `Category` document. let products = [ { name: "key limes", description: "Organic, 2 ct", price: 1_39, // `category` contains the `Category` document reference. category: produce, stock: 1000 }, { name: "lemons", description: "Organic, 3 ct", price: 1_39, category: produce, stock: 1000 } ] // Use `map()` to create `Product` documents // from the previous Array. products.map(product => Product.create({ product // Project `name`, `description`, and resolved // `category` fields of the `Product` documents. })) { name, description, category } ``` ### [](#many-to-many)Many-to-many relationships A many-to-many relationship exists when a document in one collection is associated with multiple other documents in another collection and the reverse. 
In Fauna, creating a many-to-many relationship requires a third collection to track the associations: ```fsl // The `OrderItem` collection creates many-to-many // relationships between `Order` and `Product` documents. // A product can be included with multiple orders. // An order can contain multiple products. collection OrderItem { order: Ref product: Ref quantity: Int unique [.order, .product] } ``` The following query instantiates the many-to-many relationship: ```fql // Defines data for `Order` collection documents. let orderData = [ { customer: Customer.byId('111'), status: "processing", createdAt: Time.now() }, { customer: Customer.byId('222'), status: "processing", createdAt: Time.now() } ] // Defines data for `Product` collection documents. let productData = [ { name: "kiwis", description: "Organic, 2 ct", price: 2_39, // `category` contains the `Category` document reference. category: Category.byName("produce").first(), stock: 1000 }, { name: "oranges", description: "Organic, 3 ct", price: 3_39, category: Category.byName("produce").first(), stock: 1000 } ] // Creates `Order` and `Product` documents using // the previous data. let orders = orderData.map(doc => Order.create(doc)) let products = productData.map(doc => Product.create(doc)) // Create `OrderItem` documents for // each order and product. orders.flatMap(order => products.map(product => OrderItem.create({ order: order, product: product, quantity: 1 }) ) // Return the resolved `order`, `product`, and `quantity` // fields for each `OrderItem` document. ) { order, product, quantity } ``` ## [](#index)Index document relationships When you [index](../indexes/) a field that contains a document, you index a document reference. The reference consists of the document’s collection and document ID: ```fsl collection Product { ... // The `category` field contains a reference to // a `Category` collection document. category: Ref ... // Indexes the `category` field as an index term. // The index stores `Category` document references. // Example reference: Category("123") index byCategory { terms [.category] } } ``` An index can’t store a referenced document’s fields. An index also can’t store a computed field that references another document. See [Patterns to avoid](#patterns-to-avoid). ### [](#query-index)Query indexed document relationships You can’t run a [covered query](../indexes/#covered-queries) on an indexed document reference. Projection resolves the document reference, which requires a read of the document. For example: ```fql // An uncovered query. // The `category` field contains a document reference, // which can't be covered. Product.byCategory(Category.byId("123")) { category } ``` Using indexes and [computed fields](../../../reference/fsl/computed/) can make queries on document relationships more readable and convenient. See [Patterns to use](#patterns-to-use). ## [](#patterns-to-use)Patterns to use You can’t use an [index](../indexes/) to run cover queries on document relationships. However, you can use indexes and [computed fields](../../../reference/fsl/computed/) to make queries on document relationships more readable and convenient. ### [](#exact-match-doc-ref)Run an exact match search on a document reference Use a document as an index term to run exact match searches on a document reference. For example: 1. Define an index as part of a [collection schema](../../schema/): ```fsl // Defines the `Product` collection. collection Product { ... 
// The `category` field contains a reference to // a `Category` collection document. category: Ref ... // Defines the `byCategory()` index. // Use the index to get `Product` collection // documents by `category` value. In this case, // `category` contains `Category` collection documents. index byCategory { terms [.category] } } ``` 2. Use the index in a projection to fetch and resolve document references: ```fql // Get a `Category` collection document. let produce = Category.byName("produce").first() produce { id, name, // Use the `byCategory()` index to get // all products for the category. products: Product.byCategory(produce) { id, name, description, } } ``` ``` { id: "789", name: "produce", products: { data: [ { id: "444", name: "avocados", description: "Conventional Hass, 4ct bag" }, { id: "555", name: "single lime", description: "Conventional, 1 ct" }, ... ] } } ``` ### [](#simplify-projectsion-computed-fields)Simplify projections with computed fields You can call an index in a [computed field](../../../reference/fsl/computed/) to simplify projections in queries that resolve document relationships. The following extends the previous example: 1. Define the [collection schema](../../schema/): ```fsl // Defines the `Product` collection. collection Product { ... // The `category` field contains a reference to // a `Category` collection document. category: Ref ... // Defines the `byCategory()` index. // Use the index to get `Product` collection // documents by `category` value. index byCategory { terms [.category] } } // Defines the `Category` collection. collection Category { ... // Defines the `all_products` computed field. // The field calls the `Product` collection's // `byCategory()` index. compute products: Set = ( category => Product.byCategory(category) ) } ``` 2. Update the previous query’s projection to use the computed field: ```fql // Get a `Category` collection document. let produce = Category.byName("produce").first() produce { id, name, // Project the `products` computed field instead of // directly calling the `byCategory()` index. products { id, name, description, } } ``` ``` // The results are the same as the previous query. { id: "789", name: "produce", products: { data: [ { id: "444", name: "avocados", description: "Conventional Hass, 4ct bag" }, { id: "555", name: "single lime", description: "Conventional, 1 ct" }, ... ] } } ``` | See FSL collection schema: Computed field definitions | | --- | --- | --- | ### [](#index-array)Index an Array of document references Use `mva()` (multi-value attribute) to index an [array](../../../reference/fql/types/#array) of document references. For example: 1. Include the [array](../../../reference/fql/types/#array) field in an index definition. Wrap the field accessor in `mva()`: ```fsl // Defines the `Store` collection. collection Store { ... // The `product` field contains an Array of references to // `Product` collection documents. products: Array> ... // Defines the `byProduct()` index. // Use the index to get `Store` collection // documents by `products` value. index byProduct { terms [mva(.products)] } } ``` 2. Use the index in a projection to query: ```fql // Gets a `Product` collection document. let product = Product.byName("avocados").first() // Uses projection to return `id`, `name`, and `stores` fields // for the `Product` collection document. product { id, name, // Uses the `byProduct()` index to get all stores // that contain the product in the `products` field. 
stores: Store.byProduct(product) { id, name, products } } ``` ``` { id: "444", name: "avocados", stores: { data: [ { id: "12345", name: "DC Fruits", products: [ { id: "444", coll: Product, ts: Time("2024-11-01T17:11:54.200Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") }, { id: "555", coll: Product, ts: Time("2024-11-01T17:11:54.200Z"), name: "single lime", description: "Conventional, 1 ct", price: 35, stock: 1000, category: Category("789") } ] }, ... ] } } ``` ## [](#patterns-to-avoid)Patterns to avoid An [index](../indexes/) can’t store a referenced document’s fields. Avoid index definitions that attempt to index these fields. ### [](#dont-index-fields)Don’t index fields of referenced documents Don’t attempt to use an index to store fields of a referenced document: ```fsl // Defines the `Customer` collection. collection Customer { ... // The `address` field contains a reference to a // `Address` collection document. address: Ref
.... // INCORRECT: // Fauna can't index the previous `Address` document's // `city` field. The `city` field references another document. index byCity { terms [.address.city] } } // Defines the `Address` collection. collection Address { street: String city: String } ``` ### [](#dont-index-comp-fields)Don’t index computed fields that reference other documents Don’t index a computed field that references another document: ```fsl // Defines the `Customer` collection. collection Customer { ... // The `address` field contains a reference to a // `Address` collection document. address: Ref
.... // The `city` computed field gets the previous // `Address` document's `city` field. compute city = ((customer) => customer.address.city) // INCORRECT: // Fauna can't index the computed `city` field. // The field references another document. index byCity { terms [.city] } } collection Address { street: String city: String } ``` ### [](#dont-index-doc-ids)Don’t index the IDs of document references When a field contains a document reference, index the field rather than the referenced document’s ID: ```fsl // Defines the `Product` collection. collection Product { // The `category` field contains a reference to // a `Category` document. category: Ref // CORRECT: index byCategory { terms [.category] } // INCORRECT: // Fauna can't index the previous `Category` document's `id` field. index byCategoryId { terms [.category.id] } } ``` ## [](#delete)Delete document relationships To delete a document relationship, remove the field that contains the document reference. Removing the field does not delete the referenced document. For example: ```fql // Updates a `Product` collection document. Product.byId("111")?.update({ // Removes the `category` field, which contains a // reference to a `Category` collection document. // Removing the `category` field does not delete // the `Category` document. category: null }) ``` ### [](#dangling-refs)Dangling references Deleting a document does not remove its inbound [document references](./). Documents may contain [references](./) to [Nulldocs](../../../reference/fql/types/#nulldoc) — documents that don’t exist. These are called dangling references. For example: ```fql // Gets a `Product` collection document. // Use projection to return `name`, `description`, and `category` fields. Product.byId("111") { name, description, // The `category` field contains a reference to a `Category` collection document. category } ``` ``` { name: "cups", description: "Translucent 9 Oz, 100 ct", // If the referenced `Category` collection document doesn't exist, // the projection returns a NullDoc. category: Category("123") /* not found */ } ``` ### [](#cascading-delete)Perform a cascading delete A cascading delete is an operation where deleting a document in one collection automatically deletes related documents in other collections. Fauna doesn’t provide automatic cascading deletes for user-defined collections. Instead, you can use an index and [`set.forEach()`](../../../reference/fql-api/set/foreach/) to iterate through a document’s relationships. In the following example, you’ll delete a `Category` collection document and any `Product` documents that reference the category. 1. Define an index as part of a [collection schema](../../schema/): ```fsl collection Product { ... category: Ref ... // Defines the `byCategory()` index. // Use the index to get `Product` collection // documents by `category` value. In this case, // `category` contains a reference to a `Category` collection document. index byCategory { terms [.category] } } ``` 2. Use the index and [`set.forEach()`](../../../reference/fql-api/set/foreach/) to delete the category and any related products: ```fql // Gets a `Category` collection document. let category = Category.byId("333") // Gets `Product` collection documents that // contain the `Category` document in the `category` field. let products = Product.byCategory(category) // Deletes the `Category` collection document. category?.delete() // Deletes `Product` collection documents that // contain the `Category` document in the `category` field. 
products.forEach(.delete()) // Returns `null` ``` # Data modeling best practices This guide covers best practices for modeling data in Fauna. ## [](#use-indexes-for-commonly-accessed-data)Use indexes for commonly accessed data [Indexes](../indexes/) are the most important and effective tool to increase performance and reduce the cost of your queries. Avoid [uncovered queries](../indexes/#covered-queries) whenever possible. To reduce document reads, include any frequently queried fields in indexes. | See Indexes | | --- | --- | --- | ## [](#avoid-storing-unneeded-history)Avoid storing unneeded history A [collection schema](../../schema/#collection-schema)'s [`history_days`](../../doc-history/#history-retention) setting defines the number of days of history to retain as document snapshots. You can use these historical snapshots to run [temporal queries](../../doc-history/#temporal-query) or replay events in [event feeds and event streams](../../cdc/). Avoid storing unnecessary history. A high `history_days` setting has several impacts: * **Increased read ops:** To support [temporal queries](../../doc-history/#temporal-query), indexes cover field values from both current documents and their [historical document snapshots](../../doc-history/). To enable quicker [sorting](../indexes/#sort-documents) and [range searches](../indexes/#range-search), current and historical index entries are stored together, sorted by index `values`. All indexes implicitly include an ascending [document `id`](../documents/#meta) as the index’s last value. When you read data from an index, including the [`collection.all()`](../../../reference/fql-api/collection/instance-all/) index, Fauna must read from both current and historical index entries to determine if they apply to the query. Fauna then filters out any data not returned by the query. You are charged for any Transactional Read Operations (TROs) used to read current or historical index data, including data not returned by the query. You are not charged for any historical data older than the retention period set by the [`history_days` setting](../../doc-history/#history-retention). * **Longer index build times:** Because indexes include historical data, a high `history_days` setting can increase the [index build times](../indexes/#builds). * **Increased query latency on indexes:** If an indexed field value changes frequently, the index must retain more historical data. A high `history_days` setting can increase query latency on the index. * **Increased storage:** More document snapshots and historical index data is retained, consuming additional database storage and increasing storage costs. ## [](#use-computed-fields-to-reduce-storage)Use computed fields to reduce storage If you’re storing a large amount of data, you can use [computed fields](../../../reference/fsl/computed/) to reduce storage where applicable. Computed fields aren’t part of the original document or persistently stored. Instead, the field’s value is computed on each read. To avoid unneeded computes on read, use [projection](../../../reference/fql/projection/) to only request computed fields when needed in queries. | See FSL collection schema: Computed field definitions | | --- | --- | --- | ## [](#use-schema-to-progressively-enforce-document-types)Use schema to progressively enforce document types Use a collection schema’s document type to check the presence and type of document field values on write. You use document types to enforce enumerated field values and allow arbitrary ad hoc fields. 
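For example, here is a minimal sketch of a collection schema (the collection and field names are illustrative) that types a few fields, constrains `status` to an enumerated set of literal values, and keeps a wildcard constraint so documents can still carry arbitrary ad hoc fields:

```fsl
collection SupportTicket {
  // Required, typed fields.
  title: String
  status: "open" | "pending" | "closed"

  // Optional, typed field.
  assignee: String?

  // Wildcard constraint.
  // Accepts arbitrary ad hoc fields of any type.
  *: Any
}
```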
If your application’s data model changes, you can use [zero-downtime migrations](../../schema/#schema-migrations) to add field definitions for ad hoc fields and normalize field values. This lets you move from a permissive document type to a strict one (or the reverse). | See Schema | | --- | --- | --- | ## [](#validate-data-with-constraints)Validate data with constraints Use constraints to validate field values using predefined rules. For example, you can use a [unique constraint](../../../reference/fsl/unique/) to ensure each end user has a unique email address. Similarly, you can use a [check constraint](../../../reference/fsl/check/) to apply other business logic. For example, you can ensure: * `age` field values are greater than zero * Projects are scheduled in the future * Purchases don’t reduce a user’s `balance` to a negative number | See FSL collection schema: Unique constraint definitions, FSL collection schema: Check constraint definitions | | --- | --- | --- | ## [](#use-ttl-for-document-retention)Use `ttl` for document retention Use the optional `ttl` (time-to-live) document metadata field to automatically clean up completed or obsolete documents. The `ttl` field sets an expiration timestamp for the document. Set a default retention period for a collection’s documents using the [collection schema](../../schema/)'s `ttl_days` field. | See Document time-to-live (TTL) | | --- | --- | --- | ## [](#for-multi-tenant-apps-use-a-child-database-per-tenant)For multi-tenant apps, use a child database per tenant You can use FQL queries or the Fauna CLI to programmatically create a child database per tenant. Databases are instantly allocated. Using child databases lets you build multi-tenant applications with strong isolation guarantees. Each database is logically isolated from its peers, with separate access controls. You can manage all tenants from a single parent database. ## [](#use-cicd-to-manage-schema-across-databases)Use CI/CD to manage schema across databases An FSL schema is scoped to a single database and doesn’t apply to its peer or child databases. If you have a multi-tenant application, you can copy and deploy schema across databases using FSL and a CI/CD pipeline. See [Manage schema with a CI/CD pipeline](../../schema/manage-schema/#cicd). ## [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../schema/manage-schema/#unstaged) can cause [contended transactions](../../transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#embed)Modeling relational data Relational data represents connections between different pieces of data in your database. In Fauna, you can model these relationships in two ways: * Storing a [reference to a related document](../relationships/), similar to a foreign key in a traditional relational database. * Embedding related data directly in a parent document. ### [](#what-is-embedding)What is embedding? 
Embedding means storing non-scalar data directly inside a document, rather than in separate documents. This can include [Arrays](../../../reference/fql/types/#array), [Objects](../../../reference/fql/types/#object), or any composition of those two structures. For example, instead of creating separate documents for a customer’s address details, you might embed them directly in the customer document: ```fql Customer.create({ name: "Jane Doe", email: "jdoe@example.com", // Instead of creating a separate `Address` collection // document, address information is embedded directly // in the `address` field. address: { street: "5 Troy Trail", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) ``` ### [](#when-to-use-document-references)When to use document references Generally, we recommend using [document references](../relationships/) when: * The data is referenced across many documents. * The referenced data is frequently updated. * The relationship(s) may change. In most cases, using document references optimizes for faster, less expensive writes at the cost of slower, more expensive reads. See [Comparing document references and embedding](#compare). ### [](#when-to-use-embedding)When to use embedding We recommend using embedding when: * The referenced data is small. * The referenced data is tightly coupled to its parent document. * The parent document(s) and the referenced data are typically accessed together. In most cases, embedding optimizes for faster, less expensive reads at the cost of slower, more expensive writes. See [Comparing document references and embedding](#compare). ### [](#mixing-approaches)Mixing approaches The choice to embed related data or use document references doesn’t affect your ability to constrain a [document type](../../schema/#document-type-definitions) using [schema](../../schema/). You can mix and match, such as using [field definitions](../../schema/#field-definitions) to constrain embedded data or storing document references in schemaless documents. ### [](#compare)Comparing document references and embedding The following table outlines the major differences between using [document references](../relationships/) and embedding to model relational data. | Difference | Document references | Embedding | | --- | --- | --- | | Reads and indexing | Potentially slower and more expensive. Resolving document references requires a read of the document. You can’t index a referenced document’s field values. | Potentially faster and less expensive. Embedded field values can be indexed and retrieved without a document read. See Indexes. | | Writes | Potentially faster and less expensive. Updating a referenced document doesn’t affect documents that contain the reference. | Potentially slower and more expensive. Updating embedded data requires a rewrite of the entire parent document. | | Referential integrity | Easier to maintain referential integrity. The referenced document acts as a single source of truth. However, deleting the referenced document can create a dangling reference. | Risks violating referential integrity if the embedded data is duplicated across many documents and not kept in sync. | | Storage | Typically more efficient if the referenced data is shared across multiple documents. | Typically more efficient if the referenced data is tightly coupled with its parent document(s) and not duplicated across multiple documents. 
| ### [](#embedding-examples)Embedding examples #### [](#array-obj)Embed an Array of objects on one side of the relation To model one-to-many or many-to-many relationships, you can embed data as an array of objects: ```fql film.create({ title: "Academy Dinosaur", actors: [ { name: { first: "Penelope", last: "Guinness" } }, { name: { first: "Johnny", last: "Lollobrigida" } }, ] }) ``` By replacing an association table or collection with an embedded Array, querying the data becomes straightforward: ```fql // Assuming you have an index `byTitle` film.byTitle("Academy Dinosaur").first() { actors } ``` This pattern satisfies the need for a many-to-many relationship. You can also run queries that start from the other side of the join (starting with actors rather than films), but without an index they are unoptimized: ```fql // Unoptimized query film.where(.actors.map(.name.first).includes("Penelope")) ``` To support ad hoc filtering on the fields inside the Array of objects, introduce an index on a [computed field](../../../reference/fsl/computed/): ```fsl collection film { compute actorsByFirstName = (.actors.map(item => item.name.first)) index filmsByActor { terms [mva(.actorsByFirstName)] } } ``` Query to find all the films featuring an actor: ```fql // Optimized query film.filmsByActor("Penelope") { title } ``` | Advantages | Disadvantages | | --- | --- | | Efficient for both reads & writes. Lowest latency and simplified operations for both. | Storage likely increases due to data duplication, although storage is cheap and the duplicated fields are small. | | Querying is flexible. You can start the query from either side (film or actor). You can filter by either. All fields for the returned Set are available from both sides. | Increased effort for updating values. You’d need to apply a change to all locations if an actor’s name were to change. | | The fewest operations for both reads and writes (1), and the least compute effort. | Write concurrency. Updating both actors and films with high concurrency could cause contention. | #### [](#array-refs)Embed an Array of references on one side of the relation Instead of embedding the entire document in the parent, store references in the parent document. ```fql film.create({ title: "Giant", actors: [ actor.byId("406683323649228873"), actor.byId("416683323649229986") ] }) ``` Querying the data is similar to the previous example: ```fql // Gets the first film that includes the actor. film.where(.actors.includes(actor.byId("406683323649228873"))).first() { title } ``` | Advantages | Disadvantages | | --- | --- | | Less modeling complexity than association tables. | Requires more read I/O operations to gather query data. In this case, each actor selected into the result Set needs an additional read to gather its data. This inflates the number of reads for a query from 1 per film to 1 per film plus 1 per actor. | | Overall storage should be about the same as an association table. | Indexing is no longer available on the embedded items' raw values, increasing query complexity. In this case, a query starting from the actor side would need to use a nested query pattern (sub-queries). | | Updating the foreign record (actor in this example) is independent and fast. | | | Data duplication is less than with the basic embedded pattern. | | | Changing the Array of values in the parent document (list of actors in a film in our example) is optimized, as there is far less data to transfer and update. 
| | | Allows for the foreign data to change in the future (if we wanted to add fields to the actor’s data, like middle name, place of birth, etc.) compared to the basic embedded pattern. | | #### [](#both-sides)Embed on both sides of the relation Another potential pattern for modeling many-to-many relationships with Fauna is to embed Arrays of references into the documents on both sides of the relationship. This approach is most suitable when relationships are relatively static. One main drawback of this pattern is data redundancy. Your document structure would look like this: ``` // film document { "id": "12323", "title": "The Great Adventure", "release_year": 2024, "genre": "Adventure", "actors": [ Ref("122"), Ref("123"), ] } // actor document { "id": "222", "name": "John Smith", "birthdate": "1980-05-20", "films": [ Ref("12323"), Ref("12324"), ] } ``` # Security This guide provides a high-level overview of Fauna’s security features and capabilities. ## [](#compliance)Compliance Fauna prioritizes security and compliance. Fauna is compliant with GDPR and SOC2 Type II. Fauna can be configured to meet HIPAA requirements. ## [](#data-encryption)Data encryption All Fauna connections use HTTPS. Connections must use Transport Layer Security (TLS) version 1.2 or better. This ensures point-to-point encryption between Fauna and your client application. Data uploaded to Fauna is encrypted at rest. ## [](#authentication)Authentication Fauna uses stateless, token-based authentication. Every query is an independently secured request to the [Query HTTP API endpoint](../../reference/http/reference/core-api/#operation/query). Fauna supports several methods for creating authentication tokens, including integration with external identity providers (IdPs). | See Authentication | | --- | --- | --- | ## [](#authorization)Authorization Fauna supports both role-based access control (RBAC) and attribute-based access control (ABAC). In Fauna, you can use ABAC to dynamically change access at query time based on multiple attributes. For more control, you can choose to only allow data access through server-side [user-defined functions (UDFs)](../schema/user-defined-functions/). UDFs give you granular control over the way data is accessed and returned. | See Authorization | | --- | --- | --- | ## [](#multi-tenancy)Multi-tenancy A Fauna database can have many child databases. Child databases can have their own child databases. Each database is logically isolated from its peers, with separate access controls. Queries run in the context of a single database and can’t access data outside the database. This simplifies the process of building multi-tenant applications with strong isolation guarantees. You can copy and deploy roles across databases using `.fsl` files and a CI/CD pipeline. See [Manage schema with a CI/CD pipeline](../schema/manage-schema/#cicd). # Authentication This guide provides a high-level overview of authentication in Fauna. ## [](#secrets)Secrets In Fauna, every query is an independently authenticated request to the [Query HTTP API endpoint](../../../reference/http/reference/core-api/#operation/query). You authenticate with Fauna using secrets. Secrets are passed to the Fauna HTTP API as bearer tokens. Each secret is scoped to a specific database or an account’s top-level context. Fauna uses secrets to route requests. Fauna also uses secrets for [authorization](../authorization/). Each secret can have one or more [roles](../roles/). 
These roles determine the secret’s privileges, which control data access. You can use the same secret for multiple requests. A secret remains valid until it expires or is deleted. ### [](#secret-types)Secret types Fauna supports several [authentication methods](#authentication-methods) for creating secrets. Different authentication methods create different secret types. A secret’s type affects how the secret is assigned roles. A secret’s type also determines whether the secret is tied to an [identity document](#credentials). With [attribute-based access control (ABAC)](../abac/), you can use an identity document’s attributes to dynamically grant access to data. The following table outlines the secret types and their differences. | Secret type | Primary use | Authentication method | Role assignment | Multiple roles | Identity document for ABAC | | --- | --- | --- | --- | --- | --- | | JSON Web Token (JWT) | End-user authentication | Access providers | Dynamic | Yes | No | | Key | Anonymous access. Manage child databases. | None. Typically created by an admin. See Keys. | Static | No | No | | Token | End-user authentication | Credentials. You can also use Token.create() to create tokens without using credentials. | Dynamic | Yes | Yes | An application can use multiple secrets, secret types, and authentication methods at the same time. ## [](#authentication-methods)Authentication methods Fauna supports two methods for end-user authentication: * [Access providers](#access-providers) * [Credentials](#credentials) You can use [keys](#keys) to provide anonymous access to a database. ### [](#access-providers)Access providers You can configure an external identity provider (IdP) or other JWT issuer, such as Auth0, as an access provider in your Fauna database. When a user logs in, the IdP issues a JWT. Your application can use the JWT as an [authentication secret](./). | See Access providers | | --- | --- | --- | ### [](#credentials)Credentials A credential associates an end-user password with a Fauna document that represents a user, system, or other identity. This document is called an identity document. You can use a credential to create tokens that contain an [authentication secret](./). The token’s secret is tied to the identity document. You can use the identity document’s attributes for dynamic [ABAC](../abac/). | See Credentials | | --- | --- | --- | ### [](#keys)Keys Keys provide anonymous access to a Fauna database. Unlike tokens, keys aren’t associated with an identity. You can use keys for system processes and applications that don’t require identity-based authentication. You can also use a key to bootstrap a Fauna-based [end-user authentication system](../../../build/tutorials/auth/). The key can provide the minimum access required for end users to sign up and log in to your application. You can use [scoped keys](../keys/#scoped-keys) from a parent database to manage and access child databases. | See Keys | | --- | --- | --- | ## [](#sessions)Sessions Fauna doesn’t use session-based authentication or maintain sessions on the server side. # Access providers An access provider registers an external identity provider (IdP), such as Auth0, in your Fauna database. Once set up, the IdP can issue JSON Web Tokens (JWTs) that act as Fauna [authentication secrets](../authentication/#secrets). This lets your application’s end users use the IdP for authentication. 
## [](#supported)Supported identity providers You can use any application that issues JWTs and meets the [requirements](#reqs) as an access provider. Fauna has documented setup steps for the following IdPs: ![Auth0](../../../build/_images/integration/logos/auth0.png) [Auth0](../../../build/integration/auth0/) ![Clerk](../../../build/_images/integration/logos/clerk.svg) [Clerk](../../../build/integration/clerk/) ### [](#other-supported-integrations)Other supported integrations Although they don’t meet the [requirements](#reqs) to act as an access provider, you can use the following IdPs to issue [authentication tokens](../tokens/) for end users: ![Amazon Cognito](../../../build/_images/integration/logos/cognito.png) [Amazon Cognito](../../../build/integration/cognito/) ![Microsoft Entra](../../../build/_images/integration/logos/entra.svg) [Microsoft Entra](../../../build/integration/entra/) ### [](#reqs)Requirements To act as an access provider, an IdP must: * Issue JWTs with an `aud` (audience) and `iss` (issuer) claim. The `aud` claim must be configurable. Fauna uses the `aud` and `iss` claims to verify the source and intended audience of JWTs. * Sign its JWTs using the `RS256`, `RS384`, or `RS512` algorithms. The JWT header must specify the algorithm in the `alg` claim. * Provide a URI that points to public JSON web key sets (JWKS) that Fauna can use to verify the signature of the IdP’s JWTs. ## [](#create-manage-access-providers)Create and manage access providers You create and manage access providers as FSL [access provider schema](../../../reference/fsl/access-provider/): ```fsl access provider someIssuer { // `issuer` string for the IdP. // Must match the `iss` claim in JWTs produced by the IdP. issuer "https://example.com/" jwks_uri "https://example.com/.well-known/jwks.json" role customer } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../schema/manage-schema/#fql) | Reference: FSL access provider schema | | --- | --- | --- | ### [](#access-provider-roles)Access provider roles When you define an access provider schema, you can specify one or more `role` properties with user-defined roles. Fauna assigns these roles to the provider’s JWTs. You can’t assign a built-in role to an access provider’s JWTs. Each `role` property can include a predicate to conditionally assign roles to JWTs. ```fsl access provider someIssuer { ... // Assign the `customer` role to the provider's JWTs. role customer // If the predicate is `true`, // assign the `manager` role to the provider's JWTs. role manager { // Check that the JWT payload's `scope` includes `manager`. predicate (jwt => jwt!.scope.includes("manager")) } } ``` | See Dynamically assign roles to JWTs | | --- | --- | --- | ### [](#config)Set up an access provider Setting up an access provider requires configuration in Fauna and the external IdP. To set up the access provider in Fauna, you must include the following information from the IdP in the access provider schema: * An `issuer` string that matches the `iss` claim in JWTs issued by the IdP. * A `jwks_uri` that points to a set of JWKS for the IdP’s JWTs. Fauna uses the keys to verify the signature of the JWTs. To issue valid Fauna JWTs, you must provide the IdP with an `audience` URL from Fauna. 
The `audience` URL is globally unique and scoped to a Fauna database. All access providers in a database use the same `audience` URL. The following procedure outlines the broad steps for setting up an access provider. The exact steps will vary based on the IdP: 1. Retrieve the `issuer` string and `jwks_uri` from the IdP. 2. Create an FSL access provider schema that includes the: * `name` for the access provider. * IdP’s `issuer`. Must be unique to the access provider. * IdP’s `jwks_uri`. Must be unique to the access provider. * Roles for the provider’s JWTs in one or more `role` properties. ```fsl access provider someIssuer { issuer "https://example.com/" jwks_uri "https://example.com/.well-known/jwks.json" role customer } ``` 3. Submit the schema to Fauna using any of the following: * The [Fauna CLI](../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) 4. Get the database’s `audience` URL from the Dashboard or using an FQL query. The `audience` URL consists of `https://db.fauna.com/db/` followed by the database’s [global id](../../data-model/databases/#global-id). To get the `audience` URL using FQL: ```fql AccessProvider.byName("someIssuer") ``` ``` { name: "someIssuer", coll: AccessProvider, ts: Time("2099-07-08T17:41:55.280Z"), jwks_uri: "https://example.com/.well-known/jwks.json", roles: [ "customer", { role: "manager", predicate: "(jwt) => jwt!.scope.includes(\"manager\")" } ], // The database's `audience` URL for access providers audience: "https://db.fauna.com/db/abc123", issuer: "https://example.com/" } ``` 5. Configure the IdP to include the `audience` URL in the `aud` claim of JWTs produced by the IdP. 6. Complete any other required configuration in the IdP. ### [](#how)How authentication works with an access provider. Once set up, the IdP issues a JWT when a user logs in to the IdP (or at another trigger). Your application can use the JWT as an [authentication secret](../authentication/). The following outlines how this authentication flow typically works: ![An overview of the sequence of events required to use external authentication](../../_images/sequence-jwt.svg) 1. An unauthenticated user visits your application and clicks "Log in". 2. Your application redirects to or otherwise opens a login form for the IdP. 3. The IdP presents the login form. 4. The user enters their credentials and submits them to the IdP. 5. If authentication is successful, the IdP makes a request to an endpoint for your application and provides a new JWT. 6. Your application indicates to the user that their login is successful. Your web application holds onto the JWT to use for subsequent queries to Fauna. 7. The user performs an action that requires fetching data from Fauna. 8. Your web application runs a query and uses the held JWT to authenticate the request. 9. Fauna validates the JWT. If needed, Fauna fetches the given public key from the `jwks_uri` to validate the JWT’s signature. Fauna only performs the validation step one time during the JWT validation interval, if provided, or one time per hour. If successful, Fauna returns the query’s results. 10. Your web application updates its UI for the user based on the response that it receives. 
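From Fauna’s perspective, the query in step 8 is an ordinary FQL query. The only difference is that the JWT serves as the authentication secret, so the query runs with whatever roles the access provider schema assigns to that JWT. A rough sketch, reusing the `Product` collection from earlier examples:

```fql
// Authenticated with the IdP-issued JWT as the secret.
// The query runs with the privileges of the roles assigned
// to the JWT. If those roles do not grant read access to
// `Product`, Fauna rejects the query.
Product.all().first()
```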
### [](#delete-access-provider)Delete an access provider You can delete an access provider using any of the following: * The [Fauna CLI](../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) * The FQL [`accessProvider.delete()`](../../../reference/fql-api/accessprovider/delete/) method. Deleting an access provider immediately invalidates any Fauna JWTs issued by the related IdP. ### [](#access-provider-system-coll)`AccessProvider` collection Fauna stores access providers as documents in the `AccessProvider` system collection. You can use [AccessProvider](../../../reference/fql-api/accessprovider/) methods to access `AccessProvider` collection documents in FQL. | See AccessProvider FQL docs | | --- | --- | --- | ## [](#jwt)JSON Web Tokens (JWTs) Once [set up](#config), you can use JWTs issued by an IdP as an [authentication secret](../authentication/#secrets) in Fauna. JWTs are commonly used by web applications for authorization and to transmit information. For more information about JWTs and their structure, see [https://jwt.io/introduction](https://jwt.io/introduction). ### [](#jwt-payload)JWT payload A JWT’s payload contains claims. A claim is a key and value in a JSON structure. For example: ```json { "iss": "https://fauna-auth0.auth0.com/", "sub": "google-oauth2|997696438605329289272", "aud": [ "https://fauna-auth0.auth0.com/userinfo", "https://db.fauna.com/db/abc123" ], "iat": 1602681059, "nbf": 1602681059, "exp": 1602767459, "azp": "12345abcdef", "scope": "openid profile email" } ``` The following table describes JWT claims, including the claims required by Fauna. | Claim | Required by Fauna | Description | | --- | --- | --- | | iss | true | Issuer of the JWT. Must match the issuer string in the access provider schema. | | sub | true | Subscriber or authenticated user identity. In the example, the sub claim is for an authenticated Google user whose identity is confirmed by the IdP, Auth0. | | aud | true | Audiences expected to validate and use the JWT. In the example, the aud claim includes two URLs: an IdP URL, which permits the JWT holder to request user information that isn’t included in the JWT, and an audience URL for a Fauna database. | | iat | | The "issued at" timestamp for the JWT. Represented as seconds since the Unix epoch. | | nbf | | The "not before" timestamp for the JWT. Represented as seconds since the Unix epoch. You can’t use a JWT to authenticate Fauna requests before the nbf timestamp. | | exp | | Expiration timestamp for the JWT. Represented as seconds since the Unix epoch. You can’t use a JWT to authenticate Fauna requests after the exp timestamp. | | azp | | The "authorized party" to which the JWT is issued. This is typically an ID for the user in the IdP. | | scope | | Space-delimited list of scopes. Fauna doesn’t use or process JWT scopes. You can include arbitrary scopes for use in attribute-based access control (ABAC). For an example, see Dynamically assign roles to JWTs. 
| ### [](#access-jwt-payload)Access a JWT’s payload in a query If you use a JWT to authenticate an FQL query, you can access the JWT’s payload using [`Query.token()`](../../../reference/fql-api/query/token/): ```fql Query.token() ``` ``` { iss: "https://faunadb-auth0.us.auth0.com/", sub: "6dSyciWo7pKrarUCgFxzxi545oWfgyEk@clients", aud: "https://db.fauna.com/db/abc123", iat: 1720536267, exp: 1720622667, scope: "manager", azp: "12345abcdef" } ``` | Reference: Query.token() | | --- | --- | --- | ### [](#update-jwt-roles)Update a JWT’s roles Fauna assigns roles to a JWT based on the [`role` properties](#access-provider-roles) in the related access provider’s schema. Fauna checks a JWT’s roles and related privileges at query time for every query. To update a token’s roles, edit the `role` properties in the schema. Changes to roles and privileges take effect immediately and affect pre-existing JWTs. ### [](#check-token-roles)Check a JWT’s roles You can use [user-defined functions (UDFs)](../../schema/user-defined-functions/) to check a JWT’s roles. See [Check a secret’s user-defined roles](../../query/patterns/check-secret-roles/). ### [](#jwt-expiration)JWT expiration JWTs with an `exp` (expiration) claim can’t be used after the `exp` timestamp. JWTs with an `nbf` (not before) claim can’t be used before the `nbf` timestamp. Fauna doesn’t require the `exp` or `nbf` claims. If desired, you must configure your IdP to include the claims in its JWTs. ### [](#jwt-scope)JWT scope JWTs are scoped to a specific Fauna database based on the Fauna `audience` URL in the `aud` claim. You can’t use a JWT to access parents or peers of this database. # Tokens | Reference: Token | | --- | --- | --- | A token is a type of [authentication secret](../authentication/#secrets) used to provide identity-based access to a Fauna database. You typically create and use tokens as part of a Fauna-based [end-user authentication system](../../../build/tutorials/auth/). ## [](#token-system-coll)`Token` collection Fauna stores tokens as documents in the `Token` system collection. You can use [Token](../../../reference/fql-api/token/) methods to access `Token` collection documents in FQL. | See Token FQL docs | | --- | --- | --- | ## [](#identity-document)Identity documents Each token is associated with an identity document that represents an end user, system, or other identity. The identity document is distinct from the `Token` document. Any document in any user-defined collection can act as an identity document. For example: ``` // Example `Customer` collection document that acts // as an identity document. { id: "111", coll: Customer, ts: Time("2099-07-08T13:39:55.690Z"), name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", zipCode: "20220" }, ... } ``` ### [](#multiple-id-doc-collections)Multiple identity document collections A common approach is to use different collections for different types of identity documents. This often makes [role membership](#token-roles) easier to manage. For example, you can use the `Customer` collection to store identity documents for customer end users. In the same database, you can use the `Manager` collection to store identity documents for manager end users. ## [](#token-roles)Token roles Fauna assigns [user-defined roles](../roles/) to tokens based on their identity document’s collection and the `membership` property of the related role’s schema: ```fsl // Schema for the user-defined `customer` role. 
role customer { // Assign the `customer` role to tokens with // identity documents in the `Customer` collection. membership Customer } ``` You can use membership to assign a token multiple roles. You can also use [membership predicates](../roles/#membership-predicates) to conditionally assign roles to tokens. You can’t assign a built-in role to a token. | See Role membership. | | --- | --- | --- | ### [](#update-token-roles)Update a token’s roles Fauna checks a token’s roles and related privileges at query time for every query. To update a token’s roles or privileges, edit the membership and privileges for the related roles. Changes to roles and privileges take effect immediately and affect pre-existing tokens. ### [](#check-token-roles)Check a token’s roles You can use [user-defined functions (UDFs)](../../schema/user-defined-functions/) to check a token’s roles. See [Check a secret’s user-defined roles](../../query/patterns/check-secret-roles/). ### [](#token-scope)Token scope Each token is scoped to a specific Fauna database. You can’t use a token to access parent or peer databases. ## [](#credentials)Credentials A credential associates an end-user password with an [identity document](#identity-document). Fauna stores credentials as documents in the `Credential` system collection. You can use [Credential](../../../reference/fql-api/credential/) methods to access `Credential` collection documents in FQL. | See Credential FQL docs | | --- | --- | --- | ## [](#create-manage-tokens)Create and manage tokens You create and manage tokens using FQL queries. Fauna supports two methods for token creation: * [Using a credential](#create-token-credential), which is useful for end-user authentication. * [Without a credential](#create-token-no-credential), which is useful for servers, services, and other non-user identities. ### [](#create-token-credential)Create a token with a credential The following procedure outlines the steps for creating a token with a credential. For clarity, the procedure outlines each step separately. In production, you’d typically bundle and encapsulate the steps in [user-defined functions (UDFs)](../../schema/user-defined-functions/). For an example, see [Build an end-user authentication system](../../../build/tutorials/auth/). 1. If needed, create an identity document: ```fql // Creates `Customer` collection document. // The collection contains identity documents for // customer end users. Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) ``` 2. Create a credential for the identity document: ```fql // Declares a `customer` variable. // Uses the `Customer` collection's `byEmail()` index to // get `Customer` collection documents by `email` field value. // In the `Customer` collection, `email` field values are unique // so return the `first()` (and only) document. let customer = Customer.byEmail("jdoe@example.com").first() // Creates a credential for the previous `Customer` // identity document. In this example, the `password` is a // provided by the customer end user. Credential.create({ document: customer, password: "sekret" }) ``` An identity document can only have one credential. 3. 
Call [`credential.login()`](../../../reference/fql-api/credential/login/) to create a token using the credential and its password: ```fql let customer = Customer.byEmail("jdoe@example.com").first() // Uses `byDocument()` to get the credential for // the previous `Customer` identity document. let credential = Credential.byDocument(customer) // Calls the `login()` with the credential's password. credential?.login("sekret") ``` The returned `Token` document includes the token’s `secret`, which you can use to authenticate with Fauna. A token’s secret is shown once — when you create the token. ``` { id: "12345", coll: Token, ts: Time("2099-07-08T14:34:15.520Z"), // Token's secret secret: "fn...", // `document` contains the token's identity document. document: Customer("412651138948530688") } ``` 4. If wanted, you can call [`credential.login()`](../../../reference/fql-api/credential/login/) multiple times to create multiple tokens using the same credential. | Reference: credential.login() | | --- | --- | --- | ### [](#create-token-no-credential)Create a token without a credential The following procedure outlines the steps for creating a token without a credential or related password. 1. If needed, create an identity document: ```fql // Creates `Customer` collection document. // The collection contains identity documents for // customer end users. Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) ``` 2. Call [`Token.create()`](../../../reference/fql-api/token/create/) to create a token using the identity document: ```fql // Declares a `customer` variable. // Uses the `Customer` collection's `byEmail()` index to // get `Customer` collection documents by `email` field value. // In the `Customer` collection, `email` field values are unique // so return the `first()` (and only) document. let customer = Customer.byEmail("jdoe@example.com").first() // Creates a token for the previous `Customer` // identity document. Token.create({ document: customer }) ``` The returned `Token` document includes the token’s `secret`, which you can use to authenticate with Fauna. A token’s secret is shown once — when you create the token. ``` { id: "12345", coll: Token, ts: Time("2099-07-08T14:34:15.520Z"), // Token's secret secret: "fn...", // `document` contains the token's identity document. document: Customer("111") } ``` 3. If wanted, you can call [`Token.create()`](../../../reference/fql-api/token/create/) multiple times to create multiple tokens using the same identity document. | Reference: Token.create() | | --- | --- | --- | ### [](#multiple-requests)Multiple requests You can use the same token secret to authenticate multiple Fauna requests. A secret remains valid until it expires or is deleted. ### [](#token-expiration)Token expiration A `Token` document can include an optional `ttl` (time-to-live) field that contains the token’s expiration timestamp. You can set this `ttl` when you create a token using [`credential.login()`](../../../reference/fql-api/credential/login/): ```fql let customer = Customer.byEmail("alice.appleseed@example.com").first() let credential = Credential.byDocument(customer) // Set the token's `ttl` to 60 minutes from the current time at // query. The token's secret expires at its `ttl`. 
credential?.login("fauna-demo", Time.now().add(60, "minutes")) ``` ``` { id: "412652552989966848", coll: Token, ts: Time("2099-07-08T14:34:15.520Z"), ttl: Time("2099-07-06T19:28:51.499944Z"), secret: "fn...", document: Customer("111") } ``` You can also set `ttl` using [Token](../../../reference/fql-api/token/) methods: ```fql Token.byId("412652552989966848")?.update({ // Set the token's `ttl`. ttl: Time.now().add(60, "minutes") }) ``` After the `ttl` timestamp passes, Fauna permanently deletes the token and its secret. You can’t use an expired token’s secret to authenticate requests. A `Token` document without a `ttl` does not expire and persists until deleted. ### [](#recover-token)Recover a lost token secret You can’t recover or regenerate a lost token secret. After creation, `Token` documents don’t include the token’s secret. Instead, delete the token and create a new one. ## [](#multiple-tokens)Access end-user data in a query If you use a token secret to authenticate an FQL query, you can access the token’s identity document or `Token` document in the query. You can use [ABAC](../abac/) to dynamically grant roles and privileges based on attributes of these documents. ### [](#access-a-tokens-identity-document)Access a token’s identity document Use [`Query.identity()`](../../../reference/fql-api/query/identity/) to get the identity document for a query’s authentication token: ```fql Query.identity() ``` ``` { id: "111", coll: Customer, ts: Time("2099-06-21T18:39:00.735Z"), cart: Order("413090255209497088"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` | Reference: Query.identity() | | --- | --- | --- | ### [](#access-a-token-document)Access a token document Use [`Query.token()`](../../../reference/fql-api/query/token/) to get the `Token` document for a query’s authentication token: ```fql Query.token() ``` ``` { id: "12345", coll: Token, ts: Time("2099-07-06T19:11:13.570Z"), document: Customer("111") } ``` | Reference: Query.token() | | --- | --- | --- | # Keys | Reference: Key | | --- | --- | --- | A key is a type of [authentication secret](../authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../tokens/), keys are not associated with an identity. You can use keys for system processes and applications that don’t require identity-based authentication. You can also use a key to bootstrap a Fauna-based [end-user authentication system](../../../build/tutorials/auth/). The key can provide the minimum access required for end users to sign up and log in to your application. ## [](#role)Key role Each key is assigned a single [role](../roles/). The role can be built-in or user-defined. The role determines the key secret’s privileges, which control data access. | See Roles | | --- | --- | --- | ## [](#key-scope)Scope Keys are scoped to a specific database or an account’s top-level context. If a key is scoped to a database, you can use [scoped keys](#scoped-keys) to access the database’s child databases and their descendants. You can’t use a key to access parent or peer databases. If a key is scoped to an account’s top-level context, you can use [scoped keys](#scoped-keys) to access all databases in the account. ## [](#coll)`Key` collection Fauna stores keys scoped to a database as documents in the database’s `Key` system collection. 
You can use [Key](../../../reference/fql-api/key/) methods to access `Key` collection documents in FQL. Keys scoped to an account’s top-level context are not stored as documents. | See Key FQL docs | | --- | --- | --- | ## [](#create-manage-keys)Create and manage keys You can create and manage keys using: * The [Fauna Dashboard](https://dashboard.fauna.com/) * [FQL `Key` methods](../../../reference/fql-api/key/) For example, the following FQL query creates a key with the built-in `admin` role: ```fql Key.create({ // Creates a key with the built-in `admin` role. role: "admin" }) ``` The result includes the key’s `secret`, which you can use to authenticate with Fauna. A key’s secret is shown once — when you create the key. ``` { id: "402596308273070146", coll: Key, ts: Time("2099-07-05T18:36:49.090Z"), // Key's secret secret: "fn...", role: "admin" } ``` You can use the same secret to authenticate multiple Fauna requests. A secret remains valid until it expires or is deleted. ### [](#update-a-keys-role)Update a key’s role You can update a key’s role using the [`key.update()`](../../../reference/fql-api/key/update/) method: ```fql Key.byId("402596308273070146")?.update({ // Changes the key's role to the built-in `server` role. role: "server" }) ``` Fauna checks a key’s role and related privileges at query time for every query. | Reference: key.update() | | --- | --- | --- | ### [](#ttl)Key expiration A `Key` document can include an optional `ttl` (time-to-live) field that contains the key’s expiration timestamp: ```fql Key.create({ role: "admin", // Creates a key that expires in one day. ttl: Time.now().add(1, "days") }) ``` ``` { id: "402599582390812736", coll: Key, ts: Time("2099-07-05T19:28:51.520Z"), // The `Key` document includes a `ttl` timestamp. ttl: Time("2099-07-06T19:28:51.499944Z"), role: "admin", secret: "fn..." } ``` After the `ttl` timestamp passes, Fauna permanently deletes the key and its secret. You can’t use an expired key’s secret to authenticate requests. A `Key` document without a `ttl` does not expire and persists until deleted. ### [](#key-secret)Recover a lost key secret You can’t recover or regenerate a lost key secret. After creation, `Key` documents don’t include the key’s secret. Instead, delete the key and create a new one. ## [](#scoped-keys)Scoped keys A scoped key lets you use a parent database’s admin key to send query requests to its child databases or descendants. This provides a secure way to: * Access child databases without managing multiple keys * Test user-defined roles and end-user privileges * Impersonate other secrets you could create with the existing admin key You create a scoped key by appending parameters to an existing key’s secret. The existing key must have the built-in `admin` role. For example: * `fn...:childDB:admin` creates a scoped key with the `admin` role for the `childDB` child database. * `fn...:test/performance:server` creates a scoped key with the `server` role for the `performance` database within the `test` child database. The following section outlines the syntax and examples for scoped key types. ### [](#impersonate-a-built-in-role)Impersonate a built-in role You typically use this type of scoped key to impersonate access to a child database. #### [](#syntax)Syntax `<secret>[:<database>]:<role>` #### [](#example)Example ``` // Scoped key that impersonates the `server` role // for the database scoped to the secret. fn...:server // Scoped key that impersonates the `admin` role for // the `child_db` child database. fn...:child_db:admin // Scoped key that impersonates the `admin` role for // the `grand_child_db` database, nested under `child_db`. fn...:child_db/grand_child_db:admin ``` #### [](#parameters)Parameters | Parameter | Required | Description | | --- | --- | --- | | `<secret>` | true | Key secret. | | `<database>` | | Name of a child database. To access a nested child database, separate the database names by /. For example, use test/performance to access the performance database within the test child database. | | `<role>` | true | Built-in role. Accepts admin, server, or server-readonly. | ### [](#impersonate-an-end-user)Impersonate an end user You typically use this type of scoped key to impersonate a [token](../tokens/) for a specific end user or another identity document. Impersonated tokens can be assigned multiple user-defined roles through [role membership](../roles/#role-membership). You can use the scoped key to test role membership for tokens or [role-related predicates](../abac/). #### [](#syntax-2)Syntax `<secret>[:<database>]:@doc/<collection>/<document_id>` #### [](#example-2)Example ``` // Scoped key that impersonates a token with a `Customer` // identity document in the database scoped to the secret. fn...:@doc/Customer/123 // Scoped key that impersonates a token with a `Manager` // identity document in the `child_db` child database. fn...:child_db:@doc/Manager/456 // Scoped key that impersonates a token with an `Owner` // identity document in the `grand_child_db` database, // nested under `child_db`. fn...:child_db/grand_child_db:@doc/Owner/789 ``` #### [](#parameters-2)Parameters | Parameter | Required | Description | | --- | --- | --- | | `<secret>` | true | Key secret. | | `<database>` | | Name of a child database. To access a nested child database, separate the database names by /. For example, use test/performance to access the performance database within the test child database. | | `<collection>` | true | User-defined collection in the database. If a `<database>` is provided, this is a collection in the child database. The collection typically contains documents that represent an end user or similar identity. Fauna assigns roles to the impersonated token based on this collection and the role’s membership property. | | `<document_id>` | true | Document ID for the impersonated token’s identity document. The identity document must be in the `<collection>`. | ### [](#impersonate-a-user-defined-role)Impersonate a user-defined role You typically use this type of scoped key to impersonate a user-defined role. #### [](#syntax-3)Syntax `<secret>[:<database>]:@role/<role>` #### [](#example-3)Example ``` // Scoped key that impersonates the user-defined `customer` role // for the database scoped to the secret. fn...:@role/customer // Scoped key that impersonates the `manager` role // for the `child_db` child database. fn...:child_db:@role/manager // Scoped key that impersonates the `owner` role // for the `grand_child_db` database, nested // under `child_db`. fn...:child_db/grand_child_db:@role/owner ``` #### [](#parameters-3)Parameters | Parameter | Required | Description | | --- | --- | --- | | `<secret>` | true | Key secret. | | `<database>` | | Name of a child database. To access a nested child database, separate the database names by /. For example, use test/performance to access the performance database within the test child database. | | `<role>` | true | User-defined role to impersonate. If using a child database, the role must exist in the child database. | ## [](#dashboard)Dashboard-created keys The [Fauna Dashboard](https://dashboard.fauna.com/) automatically creates a temporary key when you: * Log in to the Dashboard. 
This key has the built-in `admin` role. * Use the Dashboard Shell’s authentication drop-down to run a query using a role other than **Admin**. ![Run a query as a role](../../_images/run-as-role.png) Dashboard-created keys have a 15-minute [`ttl` (time-to-live)](#ttl) and are scoped to their specific database. Related `Key` documents include a `data` field with related metadata: ``` { id: "414467050449141793", coll: Key, ts: Time("2099-11-13T19:17:11.020Z"), ttl: Time("2099-11-13T19:32:09.915Z"), data: { name: "System-generated dashboard key" }, role: "admin" } ``` The Dashboard surfaces this metadata in the database’s **Keys** tab on the **Explorer** page. ![Key’s tab in the Fauna Dashboard](../../_images/keys-tab.png) # Authorization This guide provides a high-level overview of authorization in Fauna. ## [](#roles)Roles Fauna uses [secrets](../authentication/#secrets) for authorization. You create a secret through [authentication](../authentication/). Each secret can have one or more [roles](../roles/). The roles determine the secret’s privileges, which control data access. Fauna ships with built-in roles. You can also create user-defined roles. User-defined roles let you grant privileges to specific resources, such as collections or user-defined functions (UDFs). | See Roles | | --- | --- | --- | ## [](#attribute-based-access-control-abac)Attribute-based access control (ABAC) Fauna supports both role-based access control (RBAC) and attribute-based access control (ABAC). In an ABAC model, you conditionally grant a user or system access to data based on attributes. For example, you can adjust access based on: * The user’s current location, status, or recent activity * An accessed document’s current status or field values * Date or time of day Unlike many systems, Fauna checks privileges and dynamically grants access at query time for every query. | See Attribute-based access control (ABAC) | | --- | --- | --- | ## [](#user-defined-functions)User-defined functions A role can grant the privilege to call a server-side [user-defined function (UDF)](../../schema/user-defined-functions/). When you define a UDF, you can specify an optional role. If provided, the UDF runs using the role’s privileges, regardless of the secret used to call it. This lets you grant access to sensitive data in a controlled, prescribed way — without granting broader privileges. You can customize the format of data returned by a UDF. This lets you mask, transform, or remove specific fields as needed. | See User-defined functions (UDFs) | | --- | --- | --- | # Roles | Reference: | FSL role schema | | --- | --- | --- | --- | Fauna uses [secrets](../authentication/#secrets) for authentication and authorization. Roles determine a secret’s privileges, which control data access. ## [](#role-types)Role types Fauna supports two types of roles: * [Built-in roles](#built-in-roles) * [User-defined roles](#user-defined-role) ### [](#built-in-roles)Built-in roles Built-in roles grant access to all of a database’s resources. Built-in roles are assigned to [keys](../keys/) and typically used for system processes. You can’t assign built-in roles to [tokens](../tokens/) or [JWTs](../access-providers/#jwt). 
| Built-in role | Privileges |
| --- | --- |
| `admin` | Full access to a database, including all: collections (including their documents and indexes), access providers (but not their JWTs’ secrets), child databases, keys (but not their secrets), schema (including `.fsl` schema files), tokens (but not their secrets), user-defined functions (UDFs), and user-defined roles. You can also create an admin key in a Fauna account’s top-level context. This key lets you create and manage top-level databases. |
| `server` | Same as `admin`, except the `server` role can’t access: access providers, child databases, keys, or user-defined roles. |
| `server-readonly` | Same as `server`, except the `server-readonly` role only has read privileges. |
| `client` | Deprecated. Do not use. |

### [](#user-defined-role)User-defined roles

User-defined roles can grant:

* Access to documents and [indexes](../../data-model/indexes/) in specific collections, including user-defined and system collections.
* The ability to call specific [user-defined functions (UDFs)](../../schema/user-defined-functions/).

You create and manage user-defined roles as FSL [role schema](../../../reference/fsl/role/):

```fsl
role manager {
  // Assign the `manager` role to tokens with
  // an identity document in the `Manager` collection.
  // Not applicable to JWTs or keys.
  membership Manager

  // If the predicate is `true`,
  // assign the `manager` role to tokens with
  // an identity document in the `Customer` collection.
  membership Customer {
    // Check that the identity document's
    // `accessLevel` field value is `manager`.
    predicate (customer => customer.accessLevel == 'manager')
  }

  // Grant `read` access to the `Customer` collection,
  // including its documents and indexes.
  privileges Customer {
    read
  }

  // Grant full access to the `OrderItem` collection.
  privileges OrderItem {
    create
    read
    write
    delete
  }

  // If the predicate is `true`,
  // grant `create` access to the `Order` collection.
  privileges Order {
    create {
      predicate (doc =>
        // Check the order's `status` field.
        doc.status == "cart"
      )
    }
  }

  // If the predicate is `true`,
  // grant `read` access to the `Manager` collection.
  privileges Manager {
    read {
      // Check that the `Manager` document
      // is the token's identity document.
      // `Query.identity()` is `null` for JWTs or keys.
      predicate (doc => Query.identity() == doc)
    }
  }

  // If the predicate is `true`,
  // grant `call` access to the
  // user-defined `checkout` function.
  privileges checkout {
    call {
      // Check that the `orderId` belongs to the user.
      // `Query.identity()` is `null` for JWTs or keys.
      predicate ((orderId, status, payment) => {
        let order = Order.byId(orderId)!
        order?.customer == Query.identity()
      })
    }
  }
}
```

## [](#role-collection)`Role` collection

Fauna stores user-defined roles as documents in the `Role` system collection. These documents are an FQL version of the FSL role schema. You can use [Role](../../../reference/fql-api/role/) methods to access and manage user-defined roles in FQL.

| See Role FQL docs | | --- | --- | --- |

## [](#role-membership)Membership

Fauna assigns user-defined roles to tokens based on their identity document’s collection and the role’s `membership` property:

```fsl
role customer {
  // Assign the `customer` role to tokens with
  // identity documents in the `Customer` collection.
  membership Customer
}
```

The `membership` property doesn’t assign roles to JWTs or keys.

### [](#membership-predicates)Membership predicates

Each `membership` property can include a [predicate](../../../reference/fql/functions/#predicates).
Membership predicates let you conditionally assign roles to tokens: ```fsl role customer { // If the predicate is `true`, // assign the `customer` role to tokens with // identity documents in the `Customer` collection. membership Customer { // Checks the `status` field in the // token's identity document. predicate (user => user.status == "active") } } ``` You can use membership predicates to implement [attribute-based access control (ABAC)](../abac/). | See Attribute-based access control (ABAC) | | --- | --- | --- | ### [](#membership-multiple-roles)Membership for multiple roles You can use membership to assign a token up to 64 roles. For performance, Fauna stops checking a token’s roles once it verifies the token has the privileges required for a query. Fauna evaluates membership predicates sequentially, prioritizing previously successful ones. ## [](#privileges)Privileges A role can grant one or more privileges. A privilege allows a specific action, such as `read` or `write`, on a specific resource, such as a collection or UDF. Privileges act as an allowlist. Roles grant no privileges by default. ### [](#collection-privileges)Collection privileges Collection privileges grant access to a collection’s documents. A collection privilege can allow the `create`, `delete`, `read`, or `write` actions. `read` access includes the ability to [call the collection’s indexes](../../data-model/indexes/#call). An example [FSL role schema](../../../reference/fsl/role/): ```fsl role customer { // Grant read access to `Product` documents and indexes. privileges Product { read } } ``` You can also grant access to system collections that store Fauna resources: ```fsl role manager { // Grant `create` and `read` access to the `Token` system collection. // Allows the role to create token secrets. privileges Token { create read } } ``` To allow a role to create, delete, or manage user-defined collections themselves, grant access to the `Collection` system collection: ```fsl role manager { // Grant full access to the `Collection` system collection. // Allows the role to create, delete, read, and update // user-defined collections. privileges Collection { create delete read write } } ``` Built-in roles also have collection privileges. See [built-in roles](./). ### [](#fn-privileges)Function privileges Function privileges grant the ability to `call` a server-side UDF: ```fsl role customer { // Grants `call` access to the `getOrCreateCart` UDF. privileges getOrCreateCart { call } } ``` By default, UDFs run with the same privileges as the query’s authentication secret. You can override this default by specifying an optional role in the UDF’s [function schema](../../../reference/fsl/function/). If a role is specified in the function schema, the UDF runs with that role’s privileges, regardless of the authentication secret’s privileges. | See User-defined functions (UDFs) | | --- | --- | --- | ### [](#privilege-predicates)Privilege predicates Each `privilege` action can include a [predicate](../../../reference/fql/functions/#predicates). Privilege predicates let you conditionally grant privileges to a role: ```fsl role customer { // If the predicate is `true`, // grant `write` access to the `Order` collection. 
  privileges Order {
    write {
      predicate ((oldDoc, newDoc) =>
        // Disallow write access 10 hours after
        // the original order's last update (`ts`).
        Time.now().difference(oldDoc!.ts, "hours") < 10 &&
        // Check that the user's `country` is
        // in the updated order's `allowedCountries`.
        newDoc.allowedCountries.includes(
          Query.identity()!.country)
      )
    }
  }
}
```

You can use privilege predicates to implement [attribute-based access control (ABAC)](../abac/).

| See Attribute-based access control (ABAC) | | --- | --- | --- |

#### [](#privilege-predicate-arguments)Privilege predicate arguments

Privilege predicates are passed different arguments based on their action.

| Action | Predicate function signature | Arguments |
| --- | --- | --- |
| `create` | (doc: Object) => Boolean \| Null | `doc`: Object containing the document to create. Includes metadata fields. |
| `delete` | (doc: Object) => Boolean \| Null | `doc`: Object containing the document to delete. Includes metadata fields. |
| `read` | (doc: Object) => Boolean \| Null | `doc`: Object containing the document to read. Includes metadata fields. |
| `write` | (oldDoc: Object, newDoc: Object) => Boolean \| Null | `oldDoc`: Object containing the original document. Includes metadata fields. `newDoc`: Object containing the document to write. Includes metadata fields. |
| `create_with_id` | (doc: Object) => Boolean \| Null | `doc`: Object containing the document to create. Includes metadata fields. |
| `history_read` | (doc: Object) => Boolean \| Null | `doc`: Object containing the document to read. Includes metadata fields. |
| `call` | (args: Array) => Boolean \| Null | `args`: Array containing the function call’s arguments. |

## [](#query-time-evaluation)Query-time evaluation

Fauna evaluates a secret’s roles and privileges, including any predicates, at query time for every query. Changes to roles and privileges take effect immediately and affect pre-existing secrets.

### [](#check-a-secrets-roles)Check a secret’s roles

You can use a user-defined collection and UDFs to check a secret’s assigned roles. See [Check a secret’s user-defined roles](../../query/patterns/check-secret-roles/).

## [](#multi-tenancy-and-scope)Multi-tenancy and scope

Roles are scoped to a single database. A parent database’s roles don’t apply to its peer or child databases.

You can copy and deploy roles across databases using FSL and a CI/CD pipeline. See [Manage schema with a CI/CD pipeline](../../schema/manage-schema/#cicd).

# Attribute-based access control (ABAC)

Attribute-based access control (ABAC) is a security model that conditionally grants access to resources based on attributes. In Fauna, these attributes can relate to the:

* User or system requesting access, if you’re using [tokens](../tokens/)
* Resource being accessed
* Operation to perform and its effect
* Environment, such as date or time of day

## [](#before-you-start)Before you start

Before implementing ABAC in Fauna, you should be familiar with:

* [Authentication](../authentication/)
* [Authorization](../authorization/)
* [Roles](../roles/)

## [](#rbac-vs-abac)RBAC vs. ABAC

ABAC extends traditional role-based access control (RBAC). The models are not mutually exclusive. Fauna supports both RBAC and ABAC.

### [](#rbac)RBAC

In RBAC, you assign predefined roles with static privileges. If an authentication secret has a role, it’s granted all of the role’s privileges.

RBAC example

Users with the `customer` role have:

* Access to `Order` collection documents, which contain store orders.
* The ability to call the `submitOrder` user-defined function (UDF).
Users with the `manager` role have:

* Access to `Customer` collection documents, which contain customer profile data.
* Access to `Order` collection documents.

**Issues**

This approach is simple, but it’s inflexible and lacks granularity:

* Customers can’t access their own `Customer` documents.
* Customers can access any `Order` document, including orders where they’re not the customer.
* Managers can access `Customer` documents for other stores.
* Managers can access `Order` documents, even if:
  * They’re logged in at another store.
  * It’s past closing time in their store’s local time.
  * The order has a `settled` status or it’s 7 days past the `settlementDate`.

### [](#abac)ABAC

With ABAC, you can conditionally assign roles and grant privileges based on multiple attributes.

ABAC example

Users with the `customer` role have:

* Access to their own `Customer` document but no other.
* Access to `Order` documents where they’re the customer but no others.
* The ability to call the `submitOrder` UDF.

Users with the `manager` role have:

* Access to their own store’s `Customer` documents but no others.
* Access to `Order` documents if:
  * The user isn’t logged in at another store.
  * It’s before closing time in their store’s local time.
  * The order doesn’t have a `settled` status, OR it’s within 7 days of the `settlementDate`.

## [](#dynamic-abac-in-fauna)Dynamic ABAC in Fauna

In Fauna, you implement ABAC using role-related [predicates](../../../reference/fql/functions/#predicates). You can use the predicates to:

* [Dynamically assign roles](#dynamically-assign-roles)
* [Dynamically grant privileges](#dynamically-grant-privileges)

Fauna evaluates and assigns a secret’s roles and privileges, including any predicates, at query time for every query. This lets you grant access based on real-time user data and the environment.

### [](#dynamically-assign-roles)Dynamically assign roles

You can dynamically assign roles to:

* [JWTs](#jwts)
* [Tokens](#tokens)

#### [](#jwts)Dynamically assign roles to JWTs

When you define an [access provider schema](../../../reference/fsl/access-provider/), you can specify one or more `role` properties. Fauna assigns these roles to the provider’s JWTs. Each `role` property can include a predicate:

```fsl
access provider someIssuer {
  ...

  // If the predicate is `true`,
  // assign the `manager` role to the provider's JWTs.
  role manager {
    // Check that the JWT's `scope` includes `manager`.
    predicate (jwt => jwt!.scope.includes("manager"))
  }
}
```

The predicate is passed one argument: an object containing the JWT’s payload.

#### [](#tokens)Dynamically assign roles to tokens

Fauna assigns roles to a token based on its identity document’s collection and the [role schema](../../../reference/fsl/role/)'s `membership` properties. Each `membership` property can include a [membership predicate](../roles/#membership-predicates):

```fsl
role customer {
  // If the predicate is `true`,
  // assign the `customer` role to tokens with
  // identity documents in the `Customer` collection.
  membership Customer {
    // Checks the `accessLevel` field in the
    // token's identity document.
    predicate (idDoc => idDoc.accessLevel == "customer")
  }
}
```

The predicate is passed one argument: an object containing the token’s identity document.

### [](#dynamically-grant-privileges)Dynamically grant privileges

A [role schema](../../../reference/fsl/role/) typically includes several `privileges`.
Each privilege can include a [privilege predicate](../roles/#privilege-predicates):

```fsl
role customer {
  privileges Order {
    // If the predicate is `true`,
    // grant `read` access to the `Order` collection.
    read {
      predicate (doc =>
        // Check the order's `status` field.
        doc.status != "Deleted"
      )
    }

    // If the predicate is `true`,
    // grant `write` access to the `Order` collection.
    write {
      predicate ((oldDoc, newDoc) =>
        // Check the existing order's status.
        oldDoc.status != "Deleted" &&
        // Check that `customer` isn't changed by the write.
        oldDoc.customer == newDoc.customer &&
        // Check the current time.
        // Allow access after 07:00 (7 AM).
        Time.now().hour > 7 &&
        // Disallow access after 20:00 (8 PM).
        Time.now().hour < 20
      )
    }
  }
}
```

Privilege predicates are passed different arguments based on the action the privilege grants. See [Privilege predicate arguments](../../../reference/fsl/role/#privilege-predicate-arguments).

## [](#identity-based-attributes)Identity-based attributes

[Tokens](../tokens/) are tied to an identity document. You can fetch a token’s identity document using the [`Query.identity()`](../../../reference/fql-api/query/identity/) method:

```fsl
role customer {
  privileges Order {
    read {
      // `Query.identity()` gets the token's identity document.
      // The identity document typically represents a user or system.
      // In this example, `doc.customer` is the order's customer.
      // The predicate checks that the order belongs to the customer.
      predicate (doc => Query.identity() == doc.customer)
    }
  }
}
```

Predicates can also check an identity document’s fields:

```fsl
role customer {
  privileges Order {
    write {
      predicate ((oldDoc, newDoc) =>
        // Check that the user's `country` is in
        // the updated order's `allowedCountries`.
        newDoc.allowedCountries.includes(
          Query.identity()!.country) &&
        // Disallow `write` access 10 hours after
        // the last document timestamp.
        Time.now().difference(oldDoc!.ts, "hours") < 10
      )
    }
  }
}
```

For JWTs and keys, [`Query.identity()`](../../../reference/fql-api/query/identity/) returns `null`. JWTs and keys aren’t tied to an identity document and don’t support identity-based attributes.

### [](#token-metadata)Token metadata

Fauna stores tokens as documents in the `Token` system collection. This token document is distinct from the token’s identity document.

A token document can include metadata in its `data` field. You can later check this metadata in a predicate for ABAC.

Use [`token.update()`](../../../reference/fql-api/token/update/) to add metadata to a token document:

```fql
// Get an existing credential for a `Manager` collection document.
let credential = Credential.byDocument(Manager.byId("111"))

// Use `login()` to create a token using the credential and its password.
// Returns a document in the `Token` system collection.
let token = credential!.login("")

// Add metadata to the token document using the `data` property.
// The result doesn't include this property, but it's added.
token!.update({
  data: {
    clientIpAddr: "123.123.12.1"
  }
})
```

Use [`Query.token()`](../../../reference/fql-api/query/token/) to fetch the token document for the query’s authentication token. You can then access the document’s `data` field:

```fsl
role manager {
  membership Manager {
    // Assign the `manager` role if
    // the token document's `clientIpAddr` metadata is
    // in the manager's `approvedIpAddresses`.
predicate (_ => Query.identity()!.approvedIpAddresses .includes(Query.token()!.data!.clientIpAddr) ) } } ``` ## [](#environmental-attributes)Environmental attributes You can use predicates to assign roles and grant privileges based on the current date or time. Use [`Date.today()`](../../../reference/fql-api/date/today/) to get the current date: ```fsl role manager { membership Manager { // Assign the `manager` role only on weekdays. predicate (_ => Date.today().dayOfWeek < 6) } } ``` Use [`Time.now()`](../../../reference/fql-api/time/now/) to get the current time: ```fsl role manager { privileges Order { write { // Disallow `write` access if the user is logged in for // more than 60 minutes. predicate ((_, _) => Time.now().difference(Query.identity()!.login, "minutes") < 60 ) } } } ``` # Security best practices This guide covers best practices for [authentication](../authentication/) and [authorization](../authorization/) in Fauna. ## [](#follow-the-principle-of-least-privilege)Follow the principle of least privilege Users and systems should have the fewest privileges needed to complete their required tasks: * Only add privileges to roles that need them. * Only assign roles to users or systems that require them. * Only allow access to sensitive data through [user-defined functions (UDFs)](../../schema/user-defined-functions/). UDFs let you control how data is accessed and customize the format of returned data. ## [](#limit-the-number-of-user-defined-roles)Limit the number of user-defined roles Only create the roles you need when you need them. Fauna evaluates roles and privileges at query time. This lets you create or change roles as needed. Changes to roles and privileges take effect immediately and affect existing secrets. ## [](#limit-the-number-of-role-related-predicates)Limit the number of role-related predicates For the best performance and lower costs, only use [role-related predicates](../abac/) when needed. Role-related predicates are evaluated for every applicable query. Predicate evaluations consume Transactional Read and Transactional Compute Operations. ## [](#use-indexes-for-filtering)Use indexes for filtering Avoid using role-related predicates to filter collections or large sets of documents. Instead, use [indexes](../../data-model/indexes/). ## [](#set-an-expiration-for-secrets)Set an expiration for secrets When possible, set a `ttl` (time-to-live) timestamp for Fauna [keys](../keys/) and [tokens](../tokens/). To limit the impact of stolen credentials, use the shortest feasible `ttl` for your use case. Similarly, [JWTs](../access-providers/) created by an access provider should include the soonest `exp` timestamp possible for your use case. ## [](#use-environmental-and-identity-based-attributes-for-abac)Use environmental and identity-based attributes for ABAC Use predicates with [environmental attributes](../abac/#environmental-attributes), such as date or time, and [identity-based attributes](../abac/#identity-based-attributes) to limit access if credentials are stolen. For example, you can only grant access to users connecting from specific locations or IP addresses or during specific hours. ## [](#use-membership-predicates-for-environmental-and-identity-based-attributes)Use membership predicates for environmental and identity-based attributes If you use [tokens](../tokens/), use membership predicates rather than privilege predicates to check environmental attributes, such as date or time, or identity attributes. 
This avoids duplicating the predicate across multiple privileges.

## [](#update-identity-documents-in-real-time)Update identity documents in real time

If you use [tokens](../tokens/), you can update identity documents in real time to dynamically control access with role-related predicates.

For example, you can use a membership predicate to control access based on the `badgedIn` field in `Employee` identity documents. Fauna checks the predicate at query time for every query.

## [](#structure-membership-predicates-to-return-early)Structure membership predicates to return early

If you use [membership predicates](../roles/#membership-predicates) to assign multiple roles to tokens, structure the predicates to return as early as possible. This ensures Fauna spends less time evaluating the predicate. See [Membership for multiple roles](../roles/#membership-multiple-roles).

## [](#use-privilege-predicates-to-verify-document-changes)Use privilege predicates to verify document changes

Use [collection privilege](../roles/#collection-privileges) predicates to validate the input and output of document operations. For example, you can use a `write` privilege predicate to ensure users can’t read or update specific document fields. This limits the surface area for attacks.

## [](#use-privilege-predicates-to-validate-udf-arguments)Use privilege predicates to validate UDF arguments

Use [function privilege](../roles/#fn-privileges) predicates to validate the arguments passed to a UDF call. For example, you can ensure users can’t call a function with data unrelated to their tasks or scope. This limits the surface area for attacks.

## [](#avoid-using-middleware)Avoid using middleware

Connect your client application directly to Fauna to limit the surface area for attacks.

## [](#use-a-cicd-pipeline-to-copy-roles-across-databases)Use a CI/CD pipeline to copy roles across databases

Roles are scoped to a single database and don’t apply to its peer or child databases. If you have a multi-tenant application, you can copy and deploy roles across databases using FSL and a CI/CD pipeline. See [Manage schema with a CI/CD pipeline](../../schema/manage-schema/#cicd).

# Schema

A schema controls a database’s structure and behavior.

## [](#fauna-schema-language)Fauna Schema Language

In Fauna, you define schema using Fauna Schema Language (FSL). You use FSL to create and update schema for:

* [Access providers](../security/access-providers/) for authentication
* [Collections](#collection-schema), including document types
* [User-defined functions (UDFs)](user-defined-functions/)
* [User-defined roles](../security/roles/)

Collectively, these constitute the database schema.

### [](#fsl-files)Manage schema as `.fsl` files

You can create and manage schema using any of the following:

* The [Fauna CLI](manage-schema/#staged)
* The [Fauna Dashboard](https://dashboard.fauna.com/)
* The Fauna Core HTTP API’s [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema)
* [FQL schema methods](manage-schema/#fql)

The [Fauna CLI](manage-schema/#staged) and the [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema) let you manage schema as `.fsl` files.
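For instance, a schema directory might contain a `collections.fsl` file like the following minimal sketch (the file, collection, and database names are only illustrative):

```fsl
// schema/collections.fsl (illustrative path)
collection Product {
  name: String?
  price: Int = 0
}
```

You could then push the file to a database with a command such as `fauna schema push --database us/my_db --dir ./schema` and manage it like any other source file.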
Using `.fsl` files lets you:

* Store `.fsl` schema files alongside your application code
* Pull and push schema to your Fauna database from a local directory
* Place database schema under version control
* Deploy schema with [CI/CD pipelines](manage-schema/#cicd)
* Change your production schema as your app evolves using [progressive schema enforcement](#type-enforcement) and [zero-downtime migrations](#schema-migrations)

For more information, see [Manage schema as `.fsl` files](manage-schema/).

## [](#fql)FQL schema methods

Fauna stores each schema as an FQL document in a related [system collection](../data-model/collections/#system-coll). You can use methods for these system collections to programmatically create and manage schema using FQL queries.

| FSL schema | FQL system collection |
| --- | --- |
| Access provider schema | `AccessProvider` collection |
| Collection schema | `Collection` collection |
| Function schema | `Function` collection |
| Role schema | `Role` collection |

## [](#collection-schema)Collection schema

| Reference: FSL collection schema | | --- | --- | --- |

A collection schema defines the structure and behavior of a collection and its documents. It can include:

* A [document type definition](#document-type-definitions) that controls what fields are accepted in a collection’s documents. The document type definition consists of:
  * [Field definitions](#field-definitions) that define document fields
  * A [wildcard constraint](#wildcard-constraint) that allows or disallows arbitrary ad hoc fields in documents
  * [Computed field definitions](#computed-fields)
* A [migrations block](#migrations-block) for handling changes to the document type
* [Index definitions](#index-definitions) for efficient querying
* [Unique constraints](#unique-constraints) to ensure fields contain unique values
* [Check constraints](#check-constraints) for data validation
* [Document time-to-live (TTL) settings](#document-ttl)
* [Document history settings](#document-history)

You create and manage collection schema in FSL:

```fsl
collection Product {
  // Field definitions.
  // Define the structure of the collection's documents.
  name: String?
  description: String?
  price: Int = 0
  stock: Int = 0
  creationTime: Time = Time.now()
  creationTimeEpoch: Int?
  typeConflicts: { *: Any }?

  // Wildcard constraint.
  // Allows or disallows arbitrary ad hoc fields.
  *: Any

  // Migrations block.
  // Used for schema migrations.
  // Instructs Fauna how to handle updates to a collection's
  // field definitions and wildcard constraint.
  // Contains imperative migration statements.
  migrations {
    add .typeConflicts
    add .stock
    add_wildcard
    backfill .stock = 0
    drop .internalDesc
    move_conflicts .typeConflicts
    move .desc -> .description
    split .creationTime -> .creationTime, .creationTimeEpoch
  }

  // Index definition.
  // You use indexes to filter and sort documents
  // in a performant way.
  index byName {
    terms [.name]
    values [desc(.stock), desc(mva(.categories))]
  }

  // Unique constraint.
  // Ensures a field value or combination of field values
  // is unique for each document in the collection.
  // Supports multivalue attribute (`mva`) fields, such as Arrays.
  unique [.name, .description, mva(.categories)]

  // Check constraint.
  // Ensures a field value meets provided criteria
  // before writes. Written as FQL predicate functions.
  check posStock ((doc) => doc.stock >= 0)

  // Computed field.
  // A document field that derives its value from a
  // user-defined, read-only FQL function that runs on every read.
  compute InventoryValue: Number = (.stock * .price)

  // Controls whether you can write to the `ttl` field for collection
  // documents. If the collection schema doesn't contain field
  // definitions, `document_ttls` defaults to `true`. Otherwise,
  // `document_ttls` defaults to `false`.
  document_ttls true

  // Sets the default `ttl` for documents in days from their creation
  // timestamp. You can override the default `ttl` during document
  // creation.
  ttl_days 5

  // Controls document history retention.
  history_days 3
}
```

## [](#document-type-definitions)Document type definitions

A collection’s schema can include a document type definition. The definition controls what fields are accepted in a collection’s documents.

You define a document’s type using:

* [Field definitions](#field-definitions)
* A [wildcard constraint](#wildcard-constraint)
* [Computed field definitions](#computed-fields)

### [](#field-definitions)Field definitions

| Reference: FSL collection schema: Field definitions | | --- | --- | --- |

Field definitions define fields for a collection’s documents. A field definition consists of:

* A field name
* Accepted data types for the field’s values
* An optional default value

You can use field definitions to:

* Ensure each document in a collection contains a specific field
* Limit a field’s values to specific types
* Set a default value for documents missing a field
* Enumerate accepted values

```fsl
collection Product {
  // `name` is optional (nullable).
  // Accepts `String` or `null` values.
  name: String? // Equivalent to `name: String | Null`

  // `price` is optional (nullable).
  // Accepts `Int` or `null` values.
  price: Int?

  // `stock` is non-nullable.
  // Accepts only `Int` values.
  // If missing, defaults to `0`.
  stock: Int = 0

  // `creationTime` is non-nullable.
  // Accepts only `Time` or `Number` values.
  // If missing, defaults to the current time.
  creationTime: Time | Number = Time.now()

  // `category` is non-nullable.
  // Accepts only the enumerated "grocery",
  // "pharmacy", or "home goods" values.
  // If missing, defaults to "grocery".
  category: "grocery" | "pharmacy" | "home goods" = "grocery"
}
```

### [](#wildcard-constraint)Wildcard constraint

| Reference: Wildcard constraints | | --- | --- | --- |

An ad hoc field is an arbitrary document field that doesn’t have a field definition. You can use a collection schema’s [wildcard constraint](../../reference/fsl/field-definitions/#wildcard-constraints) to allow or disallow ad hoc fields in the collection’s documents.

```fsl
collection Product {
  name: String? // Equivalent to `name: String | Null`
  ...

  // Wildcard constraint.
  // This example accepts ad hoc fields of any type.
  *: Any
}
```

### [](#computed-fields)Computed fields

| Reference: FSL collection schema: Computed field definitions | | --- | --- | --- |

Computed fields derive their field value from a provided function. They let you create new fields based on existing data or calculations. You can use a computed field to:

* Combine or transform other field values
* Assign a value based on an `if ... else` expression
* Assign a value based on one or more ranges

Computed fields aren’t part of the original document or persistently stored. Instead, the field’s value is computed on each read.

## [](#type-enforcement)Document type enforcement

Fauna rejects attempts to write documents that don’t conform to a collection’s field definitions and wildcard constraint.
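For example, with the `Product` field definitions shown above, a create that doesn’t match the declared types is rejected. This is a minimal sketch; the exact error returned depends on which definition or constraint the write violates:

```fql
// Rejected: `name` must be a String or null, and "electronics"
// isn't one of the enumerated `category` values.
Product.create({
  name: 123,
  category: "electronics"
})
```

A write that conforms to the definitions, such as `Product.create({ name: "limes", price: 299 })`, succeeds, with missing fields falling back to their defaults.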
You can use the collection’s field definitions and wildcard constraint to adjust how strictly you enforce a predefined structure on collection documents:

| Strategy | Description | Field definitions | Wildcard constraint |
| --- | --- | --- | --- |
| Schemaless | Accepts ad hoc fields of any type. No fields are predefined. | No field definitions | No wildcard constraint, or a wildcard constraint of `*: Any` |
| Permissive | Accepts ad hoc fields and predefined fields. Fields must conform to the structure of their definitions. | One or more field definitions | A wildcard constraint |
| Strict | Only accepts predefined fields. | One or more field definitions | No wildcard constraint |

### [](#schemaless-by-default)Schemaless by default

If a collection has no field definitions, it’s schemaless by default. It implicitly accepts ad hoc fields of any type.

### [](#progressive-enforcement)Progressive enforcement

Using permissive document types is often helpful earlier in an application’s development. Allowing ad hoc fields lets you add fields as needed.

As your data evolves, you can use [zero-downtime migrations](#schema-migrations) to add field definitions for ad hoc fields and normalize field values. This lets you move from a permissive document type to a strict one (or the reverse).

| Tutorial: Progressively enforce a document type | | --- | --- | --- |

## [](#schema-migrations)Zero-downtime schema migrations

A schema migration is an update to a collection schema’s field definitions or wildcard constraint. Schema migrations require no downtime or locks on your database.

### [](#migrations-block)Migrations block

| Reference: FSL collection schema: Migrations block | | --- | --- | --- |

To handle migrations, you include a [migrations block](../../reference/fsl/migrations-block/) in the collection schema. The block contains one or more imperative migration statements. The statements instruct Fauna on how to migrate from the collection’s current field definitions and wildcard constraint to the new ones.

```fsl
collection Product {
  ...
  *: Any

  migrations {
    // Applied 2099-05-06
    add .typeConflicts
    add .stock
    move_conflicts .typeConflicts
    backfill .stock = 0
    drop .internalDesc
    move .desc -> .description
    split .creationTime -> .creationTime, .creationTimeEpoch

    // Applied 2099-05-20
    // Make `price` a required field.
    split .price -> .price, .tempPrice
    drop .tempPrice
    backfill .price = 1
  }
}
```

### [](#run-a-schema-migration)Run a schema migration

A typical schema migration involves the following steps:

1. Update the [field definitions](../../reference/fsl/field-definitions/) and [wildcard constraint](../../reference/fsl/field-definitions/#wildcard-constraints) in the [collection schema](../../reference/fsl/collection/).

2. Add one or more related migration statements to the collection schema’s migrations block. Include comments to group and annotate statements related to the same migration. Fauna runs each new migration statement sequentially from top to bottom. Fauna ignores unchanged migration statements from previous migrations.

3. Commit the updated collection schema to Fauna with a [staged schema change](manage-schema/#staged). To run a staged schema change using the CLI:

1. Use [`fauna schema push`](../../build/cli/v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../build/cli/v4/commands/schema/push/) stages schema changes by default:

```cli
# Replace 'us' with your preferred region group:
# 'us' (United States), 'eu' (Europe), or `global`.
# Replace 'my_db' with your database's name. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../build/cli/v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](manage-schema/#unstaged). You must first [abandon the staged schema change](#abandon). 2. Use [`fauna schema status`](../../build/cli/v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 3. When the status is `ready`, use [`fauna schema commit`](../../build/cli/v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../build/cli/v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` Once committed, changes from the migration are immediately visible in any subsequent queries. ### [](#migration-errors)Migration errors When you submit a collection schema, Fauna checks the schema’s field definitions and migration statements for potential conflicts. If a change could conflict with the collection’s data, Fauna rejects the schema with an error message. The check doesn’t require a read or scan of the collection’s documents. ## [](#index-definitions)Index definitions An index stores, or covers, specific document field values for quick retrieval. Using indexes can significantly improve query performance and reduce costs, especially for large datasets. | See Indexes | | --- | --- | --- | ## [](#unique-constraints)Unique constraints | Reference: FSL collection schema: Unique constraint definitions | | --- | --- | --- | Unique constraints ensure a field value or combination of field values is unique for each document in a collection. Fauna rejects document writes that don’t meet the constraint. ## [](#check-constraints)Check constraints | Reference: FSL collection schema: Check constraint definitions | | --- | --- | --- | Check constraints ensure field values meet a pre-defined rule. For example, you can check that field values are in an allowed range. You define a check constraint as a read-only FQL [predicate](../../reference/fql/functions/#predicates). Fauna only allows document writes if the predicate evaluates to `true`. ## [](#document-ttl)Document time-to-live (TTL) A document can include an optional `ttl` (time-to-live) field that contains the document’s expiration timestamp. 
After the `ttl` timestamp passes, Fauna permanently deletes the document.

You can use a collection schema’s `ttl_days` field to set a default `ttl` for collection documents. See [Set a default ttl](../doc-ttl/#set-default-ttl).

You can use a collection schema’s `document_ttls` field to control whether you can write to the `ttl` field for collection documents. See [Enable or disable ttl writes](../doc-ttl/#enable-disable-ttl-writes).

| See Document time-to-live (TTL) | | --- | --- | --- |

## [](#document-history)Document history

Fauna stores snapshots of each document’s history. Fauna creates these snapshots each time the document receives a write.

You can use a collection schema’s `history_days` field to set how many days of document history Fauna retains for a collection’s documents.

| See Document history | | --- | --- | --- |

## [](#protected-mode)Protected mode

Protected mode is a database setting that prohibits destructive operations on a database’s collections.

When you create a database in the [Fauna Dashboard](https://dashboard.fauna.com/), you select one of the following **Protected Mode** options:

| Option | Description |
| --- | --- |
| Disabled | Default. No operations are prohibited. |
| Enabled | Prohibits destructive operations. See Prohibited operations. |
| Inherit | Sets the database’s Protected Mode setting to the nearest ancestor’s Protected Mode setting that is not Inherit. This option is only available for child databases. |

### [](#prohibited-operations)Prohibited operations

When **Protected Mode** is **Enabled**, the following operations are prohibited.

| Resource or field | Prohibited operations | Exceptions |
| --- | --- | --- |
| Collection | Delete | |
| Index definitions | Change, Remove | Changes that don’t result in the deletion of a backing index are allowed in an unstaged schema change. Any change or removal is allowed in a staged schema change. |
| `history_days` | Decrease, Remove | |
| `ttl_days` | Decrease, Add | |
| Unique constraints | Change, Remove | |

## [](#validation)Database schema validation

When you write to a schema, Fauna parses and validates the entire database schema in a single transaction. Concurrent schema writes in the same database can cause contended transactions, even if the changes affect different resources. To avoid errors, perform schema changes sequentially instead.

## [](#version)Schema version

Fauna maintains a `schema_version` for the database schema that’s returned in Query HTTP API responses. Fauna increments this version when you write to any schema for the database. The schema version acts as a comparative value to help clients determine the minimum schema version used for query execution.

### [](#considerations)Considerations

Keep the following in mind when working with the `schema_version`:

* The `schema_version` is cached and may change without schema modifications.
* The `schema_version` is not permanently persisted in Fauna.
* You should only use the `schema_version` to verify that a query request ran against a specific minimum schema version. Do not rely on `schema_version` being consistent across requests.

### [](#client-drivers)Client drivers

Fauna’s client drivers include the `schema_version` values in a query info or query response class.
This class is used for both successful query responses and errors: * JavaScript driver: [`QueryInfo`](https://fauna.github.io/fauna-js/latest/types/QueryInfo.html) * Python driver: [`QueryInfo`](https://fauna.github.io/fauna-python/latest/api/fauna/encoding/wire_protocol.html#QueryInfo) * Go driver: [`QueryInfo`](https://pkg.go.dev/github.com/fauna/fauna-go/v2#QueryInfo) * .NET/C# driver: [`QueryResponse`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_query_response.html) * JVM driver: [`QueryResponse`](https://fauna.github.io/fauna-jvm/latest/com/fauna/response/QueryResponse.html) # User-defined functions (UDFs) A user-defined function (UDF) is a set of one or more FQL statements stored as a reusable resource in a Fauna database. Like a stored procedure in SQL, a UDF can accept parameters, perform operations, and return results. You can use UDFs to encapsulate business logic, making it easier to manage complex operations within your database. ## [](#define-a-udf)Define a UDF You create and manage a UDF as an FSL [function schema](../../../reference/fsl/function/): ```fsl function getOrCreateCart(id) { // Find the customer by ID, using the ! operator to // assert that the customer exists. // If the customer does not exist, fauna will throw a // document_not_found error. let customer = Customer.byId(id)! if (customer!.cart == null) { // Create a cart if the customer does not have one. Order.create({ status: 'cart', customer: Customer.byId(id), createdAt: Time.now(), payment: {} }) } else { // Return the cart if it already exists. customer!.cart } } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../manage-schema/#fql) | Reference: | FSL function schema | | --- | --- | --- | --- | ### [](#function-collection)`Function` collection Fauna stores UDFs as documents in the `Function` system collection. These documents have the [FunctionDef](../../../reference/fql/types/#functiondef) type and are an FQL version of the FSL function schema. You can use [Function](../../../reference/fql-api/function/) methods to access and manage UDFs in FQL. | See Function FQL docs | | --- | --- | --- | ## [](#call-a-udf)Call a UDF Once saved in a database, you can call the UDF in FQL queries against the database: ```fql // Call the `getOrCreateCart()` UDF with a // customer id of `111`. getOrCreateCart(111) ``` UDFs run within the context of a single query and can be combined with other FQL expressions: ```fql let customerId = "111" // Call the `getOrCreateCart()` UDF. getOrCreateCart(customerId) // Then call the `createOrUpdateCartItem()` UDF. createOrUpdateCartItem(customerId, "pizza", 1) ``` ## [](#udf-features)UDF features UDFs include features that let you create complex queries and workflows. ### [](#type-checking)Type checking You can explicitly type a UDF’s arguments and return value: ```fsl // The `x` argument must be a `Number`. // The function returns a `Number` value. function myFunction(x: Number): Number { x + 2 } ``` ### [](#multiple-statements)Multiple statements UDFs can contain multiple statements and expressions: ```fsl function calculateOrderTotal(order) { // Calculate the subtotal by summing up the prices of all items. 
let subtotal = order.items.fold(0, (sum, orderItem) => { if (orderItem.product != null) { sum + orderItem.product.price * orderItem.quantity } else { sum } }) // Calculate the tax based on the subtotal. let tax = subtotal * 0.1 // Return the final total including the tax. subtotal + tax } ``` Like an FQL query, a UDF returns the value of the last evaluated expression. ### [](#variadic-arguments)Variadic arguments Use the `...` syntax to create a variadic UDF that accepts an indefinite number of arguments, including zero. ```fsl // The `args` argument accepts multiple Numbers. function getLength(...args: Number): Number { args.length } ``` When called in an FQL query: ```fql getLength(1, 2, 3) ``` ``` 3 ``` A UDF can only accept one variadic argument. It must be the last argument. Variadic arguments are collected into an [Array](../../../reference/fql/types/#array). You can define a type signature to limit the types of values accepted and held in the Array. For example, the following UDF accepts a single [String](../../../reference/fql/types/#string) argument followed by a variadic argument of zero or more [Number](../../../reference/fql/types/#number)s: ```fsl function formatCurrency(symbol: String, ...amounts: Number): String { symbol + amounts.reduce((prev, cur) => prev + cur).toString() } ``` When called in an FQL query: ```fql formatCurrency("$", 2, 3) ``` ``` "$5" ``` ### [](#composability)Composability UDFs are composable, allowing you to combine multiple UDFs. For example, you can define a UDF: ```fsl // Defines the `applyDiscount()` UDF. function applyDiscount(total, discountPercent) { total * (1 - discountPercent / 100) } ``` And call the UDF in another UDF definition: ```fsl // Defines the `calculateFinalPrice()` UDF. function calculateFinalPrice(order, discountPercent) { // Calls the `calculateOrderTotal()` UDF. let total = calculateOrderTotal(order) // Calls the `applyDiscount()` UDF. applyDiscount(total, discountPercent) } ``` ### [](#error-handling)Error handling Use [`abort()`](../../../reference/fql-api/globals/abort/) to raise an [abort error](../../../reference/http/reference/errors/) from a UDF: ```fsl function validateOrderStatusTransition(oldStatus, newStatus) { if (oldStatus == "cart" && newStatus != "processing") { // The order can only transition from cart to processing. abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { // The order can only transition from processing to shipped. abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { // The order can only transition from shipped to delivered. abort("Invalid status transition.") } } ``` ### [](#multi-tenancy-and-scope)Multi-tenancy and scope UDFs are scoped to a single database. A child database can’t access its parent database’s UDFs. You can copy and deploy UDFs across databases using FSL and a CI/CD pipeline. See [Manage schema with a CI/CD pipeline](../manage-schema/#cicd). ## [](#security-and-privileges)Security and privileges You can use UDFs to control how systems and end users access sensitive data. ### [](#udf-privileges)UDF privileges A [user-defined role](../../security/roles/) can grant the privilege to call a UDF. An example [FSL role schema](../../../reference/fsl/role/): ```fsl role customer { // Grants `call` access to the `getOrCreateCart` UDF. privileges getOrCreateCart { call } } ``` The built-in `admin` and `server` roles have privileges to call any UDF. 
The built-in `server-readonly` role can call UDFs that only perform read operations. | See Function privileges | | --- | --- | --- | ### [](#runtime-privileges)Runtime privileges By default, UDFs run with the privileges of the calling query’s authentication secret. When you define a UDF, you can include an optional `@role` annotation. If provided, the UDF runs using the role’s privileges, regardless of the secret used to call it: ```fsl // Runs with the built-in `server` role's privileges. @role("server") function inventory(name) { Product.byName(name) { name, description, stock } } ``` #### [](#udf-doc-refs)Resolve document references returned by UDFs If a UDF returns [document references](../../data-model/relationships/) in the query results, the [secret](../../security/authentication/) used to run the query must have `read` privileges for the referenced document’s collection. This requirement applies even if the UDF’s `@role` has `read` privileges for the collection. For example, the following UDF returns a Set of documents. Each document contains a document reference: ```fsl // Runs with the built-in `server-readonly` role's privileges. // The role has `read` privileges for the `Category` collection. @role("server-readonly") function getCategory(name) { // Returns the `category` field, which contains // a reference to a `Category` collection document. Product.byName(name) { name, category } } ``` If you call the UDF using a secret that lacks `read` privileges for the referenced document’s collection, the reference is not resolved in the results: ```fql getCategory("limes") ``` ``` { data: [ { name: "limes", category: Category("789") /* permission denied */ } ] } ``` Although the UDF’s `@role` has the required privileges, document references in Sets and documents are lazily loaded. The references are resolved, or materialized, only when results are returned — after the UDF runs. To solve this issue without granting additional privileges, update the UDF to: * [Project](../../../reference/fql/projection/) or [map](../../../reference/fql-api/set/map/#project) any desired fields from the referenced document. * Convert the results to an eager-loading type, such as an [Array](../../../reference/fql/types/#array) or [Object](../../../reference/fql/types/#object). ##### [](#return-a-set-as-an-array)Return a Set as an array If the UDF originally returned a Set of documents, update it to return the Set as an array: ```fsl @role("server-readonly") function getCategory(name) { // Project any desired fields from the referenced // `Category` document. let products = Product.byName(name) { name, category { id, ts, name, description } } // Convert the Set to an array. products.toArray() } ``` When called using a secret that lacks privileges on the referenced documents' collection: ```fql getCategory("limes") ``` ``` [ { name: "limes", category: { id: "789", ts: Time("2099-12-12T14:22:31.560Z"), name: "produce", description: "Fresh Produce" } } ] ``` ##### [](#return-a-document-as-an-object)Return a document as an object If the UDF originally returned a single document, update it to return the document as an object instead: ```fsl @role("server-readonly") function getCategory(name) { // Project any desired fields from the referenced // `Category` document. let product = Product.byName(name).first() { name, category { id, ts, name, description } } // Convert the Document to an object. 
  Object.assign({}, product)
}
```

When called using a secret that lacks privileges on the referenced document’s collection:

```fql
getCategory("limes")
```

```
{
  name: "limes",
  category: {
    id: "789",
    ts: Time("2099-12-13T16:25:53Z"),
    name: "produce",
    description: "Fresh Produce"
  }
}
```

### [](#control-access-with-udfs)Control access with UDFs

A common pattern is to allow access to sensitive data through a UDF. The pattern lets you control how the data is accessed without granting broader privileges.

For more control, you can customize the format of data returned by a UDF. This lets you mask, transform, or remove specific fields as needed.

| Tutorial: Control access with ABAC | | --- | --- | --- |

## [](#limits)Limits

UDF calls are subject to the same [global limits](../../../reference/requirements-limits/#glimits) as FQL queries.

# Manage schema as `.fsl` files using the Fauna CLI

A [schema](../) controls a database’s structure and behavior. In Fauna, you define schema using the [Fauna Schema Language (FSL)](../../../reference/fsl/). You can manage schema as `.fsl` files using the [Fauna CLI](../../../build/cli/v4/).

This page covers how to manage `.fsl` files using the CLI. Using `.fsl` files lets you version control, review, and automate schema changes.

## [](#set-up-a-schema-directory)Set up a schema directory

You typically store `.fsl` files in a schema directory alongside your application code. The directory can use any name. `schema` is a common choice.

For more information about setting up a project using the CLI, see [Set up a project using FSL and the Fauna CLI](../../../build/tutorials/project/).

### [](#create-fsl-schema-files)Create FSL schema files

An `.fsl` file can contain schema for multiple resources. You can use multiple `.fsl` files to organize your schema. While subject to [limits](#limits), there is no performance benefit to splitting `.fsl` files or storing larger, individual files.

Valid FSL filenames must use the `.fsl` extension and can’t start with `*`.

#### [](#fsl-schema-syntax)FSL schema syntax

The following table links to FSL syntax to use when creating `.fsl` files:

| Resource | FSL Syntax |
| --- | --- |
| Access providers | Access provider schema |
| Collections, including check constraints, computed field definitions, field definitions, index definitions, migrations, and unique constraints | Collection schema |
| User-defined functions (UDFs) | Function schema |
| User-defined roles | Role schema |

#### [](#common-conventions)Common conventions

A common convention is to organize your `.fsl` files by resource type to maintain a clear and maintainable schema structure:

```
schema/
├── collections.fsl        # Collection schema
├── functions.fsl          # User-defined functions
├── roles.fsl              # Role definitions
└── access-providers.fsl   # Access provider definitions
```

When you push `.fsl` files to Fauna, Fauna retains the filenames and directory organization. This lets you later pull the same files.

#### [](#filenames-for-generated-fsl-schema)Filenames for generated FSL schema

Schema created using [FQL schema methods](#fql) or the [Fauna Dashboard](https://dashboard.fauna.com/) is stored in the `main.fsl` file. After pulling the file locally, you can reorganize the schema into separate files as desired.

#### [](#limits)Limits

A database can have up to 1,024 `.fsl` files, including `main.fsl`. This limit does not include `.fsl` files for child databases.

## [](#push)Push schema to Fauna

A project directory includes `.fsl` files for the project’s database.
You can use the Fauna CLI to push a project’s `.fsl` files to Fauna. ### [](#staged)Run a staged schema change A staged schema change lets you change one or more [collection schema](../../../reference/fsl/collection/) without index downtime due to [index builds](../../data-model/indexes/#builds). To run a staged schema change, you must use the [Fauna CLI](../../../build/cli/) or the Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema). You can’t run a staged schema change using [FQL schema methods](#fql) or the [Fauna Dashboard](https://dashboard.fauna.com/). To run a staged schema change using the Fauna CLI: 1. Make the desired changes to `.fsl` files in your schema directory. 2. Use [`fauna schema push`](../../../build/cli/v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../../build/cli/v4/commands/schema/push/) stages schema changes by default: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../../build/cli/v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../../data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](#unstaged). You must first [abandon the staged schema change](#abandon). 3. Use [`fauna schema status`](../../../build/cli/v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 4. When the status is `ready`, use [`fauna schema commit`](../../../build/cli/v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../../build/cli/v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` ### [](#unstaged)Run an unstaged schema change To immediately commit schema changes without staging, use the `--active` option: ```cli fauna schema push \ --database us/my_db \ --active ``` Schema changes that trigger an [index build](../../data-model/indexes/#builds) may result in downtime where the index is not queryable. You can’t run an unstaged schema change if a database has staged schema. You must [abandon the staged schema changes](../../../build/cli/v4/commands/schema/abandon/) first. 
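As noted above, while a schema change is staged, FQL’s schema methods read from the staged schema. The following minimal sketch, using an illustrative `Product` collection, shows a query you might run from the Dashboard Shell or a driver to inspect the staged definition:

```fql
// While schema changes are staged, system collection methods
// reflect the staged schema. This returns the staged `Product`
// collection definition rather than the active one, until the
// staged change is committed or abandoned.
Collection.byName("Product")
```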
### [](#create-and-delete-schema)Create and delete schema Committing a schema to Fauna creates the related resource. For example, committing a collection schema to Fauna creates the collection. Similarly, you can delete the resource by removing the schema from the FSL files in a project’s schema directory then pushing the changes to Fauna. ## [](#compare-schema)Compare local, staged, and active schema Use [`fauna schema diff`](../../../build/cli/v4/commands/schema/diff/) to compare a project’s local, staged, and active schema. For example, to compare the local schema to the remote staged schema: ```cli fauna schema diff \ --database us/my_db ``` To compare the local schema to the remote active schema: ```cli fauna schema diff \ --database us/my_db \ --active ``` To compare the remote active schema to the remote staged schema: ```cli fauna schema diff \ --database us/my_db \ --staged ``` By default, [`fauna schema diff`](../../../build/cli/v4/commands/schema/diff/) prints a semantic summary of the changes. To get a textual diff, use the `--text` option. For example: ```cli fauna schema diff \ --database us/my_db \ --text ``` ## [](#pull-schema)Pull remote schema from Fauna Use [`fauna schema pull`](../../../build/cli/v4/commands/schema/pull/) to pull a database’s remote `.fsl` schema files into a project’s local schema directory: ```cli fauna schema pull \ --database us/my_db ``` If the database has [staged schema](#staged), the command pulls the database’s remote staged schema by default. If the database has no staged schema, the command pulls the database’s remote active schema. ### [](#active)Pull active schema files Use the [`fauna schema pull`](../../../build/cli/v4/commands/schema/pull/) command’s `--active` option to pull the database’s remote active schema regardless of whether the database has [staged schema](#staged): ```cli fauna schema pull \ --database us/my_db \ --active ``` ### [](#delete-local)Delete local files The [`fauna schema pull`](../../../build/cli/v4/commands/schema/pull/) command overwrites existing schema files in the local directory. If wanted, you can use the `--delete` option to delete local `.fsl` files that aren’t part of the remote schema: ```cli fauna schema pull \ --database us/my_db \ --delete ``` ## [](#cicd)Manage schema with a CI/CD pipeline You can use [schema-related Fauna CLI commands](../../../build/cli/v4/commands/schema/) to manage schema as `.fsl` files. The following examples show how you can use a CI/CD pipeline to: * [Test schema changes in pull requests](#cicd-test) * [Automatically stage and deploy schema changes](#cicd-stage) ### [](#cicd-test)Test schema changes in pull requests The following GitHub workflow shows how to use the Fauna CLI to start a local [Fauna container](../../../build/tools/docker/) and test a pull request’s database schema changes. The workflow ensures that the schema changes are: * Compatible with your project’s tests * Valid and don’t return [migration errors](../#migration-errors) that would conflict with the database’s data. 
In your project’s `.github/workflows/` directory, create a `.yml` file with the following contents: ```yaml name: Validate database schema changes - Pull request on: pull_request: paths: # Set this to the path to your schema directory - 'schema/**' jobs: schema_change_valid: name: Schema Change Valid runs-on: ubuntu-latest strategy: matrix: node-version: [20.x] steps: - uses: actions/checkout@v4 - name: Use Node.js ${{ matrix.node-version }} uses: actions/setup-node@v4 with: node-version: ${{ matrix.node-version }} cache: 'npm' - name: Install Fauna CLI run: npm install -g fauna-shell - name: Test PR schema changes run: | # Check out the base branch that the pull request targets. git fetch origin ${{ github.base_ref }}:${{ github.base_ref }} git checkout ${{ github.base_ref }} # Start a local Fauna container and create the 'my_db' # database. Apply the base branch's schema to the 'my_db' # database's active schema. fauna local \ --dir ./path/to/schema/dir \ --database my_db \ --typechecked # Run a query to seed collection documents in the 'my_db' # database. Collection documents are required to test # migration blocks in collection schema. If a collection # has never contained documents, the collection schema's # migration block is ignored, if present. Seeding documents # ensures the migration block is validated. fauna query \ --local \ --database my_db \ --input ./path/to/seed-query.fql # Check out the PR branch. git fetch origin ${{ github.head_ref }}:${{ github.head_ref }} git checkout ${{ github.head_ref }} # Push the PR's schema changes to the local 'my_db' database. # Fails if the schema changes aren't valid. fauna schema push --dir ./path/to/schema/dir \ --local \ --database my_db \ --active \ --no-input # Run tests in the local 'my_db' database. # Fails if the schema changes are incompatible. # The command(s) may differ based on your project's setup. # This example uses 'npm test'. echo "Running tests ..." npm test ``` ### [](#cicd-stage)Automatically stage and deploy schema changes The following examples show how you can use the Fauna CLI to automatically [stage and deploy](#staged) `.fsl` files in a schema directory: * [GitHub](#github) * [GitLab](#gitlab) #### [](#github)GitHub In your project’s `.github/workflows/` directory, create a `.yml` file with the following contents: ```yaml name: Main CI on: push: branches: [ main ] jobs: ci: runs-on: ubuntu-latest env: FAUNA_SECRET: ${{ secrets.FAUNA_SECRET }} strategy: matrix: node-version: [20.x] steps: - uses: actions/checkout@v4 - name: Use Node.js ${{ matrix.node-version }} uses: actions/setup-node@v4 with: node-version: ${{ matrix.node-version }} cache: 'npm' - name: Install Fauna CLI run: npm install -g fauna-shell - name: Stage schema run: fauna schema push \ --dir ./path/to/schema/dir \ --secret $FAUNA_SECRET \ --no-input - name: Check schema status id: schema-check run: | attempts=0 max_attempts=60 # 30 minutes with 30-second intervals while [ $attempts -lt $max_attempts ]; do STATUS=$(fauna schema status --secret $FAUNA_SECRET | grep -oP '(?<=Staged changes: )\w+') echo "Current status: $STATUS" if [ "$STATUS" = "ready" ]; then echo "Schema is ready" echo "schema_status=ready" >> $GITHUB_OUTPUT exit 0 elif [ "$STATUS" = "failed" ]; then echo "Schema staging failed" echo "schema_status=failed" >> $GITHUB_OUTPUT exit 1 fi echo "Waiting for schema to be ready..."
sleep 30 attempts=$((attempts + 1)) done echo "Timeout waiting for schema status" echo "schema_status=timeout" >> $GITHUB_OUTPUT exit 1 - name: Commit or abandon schema if: always() run: | if [[ "${{ steps.schema-check.outputs.schema_status }}" == "ready" ]]; then fauna schema commit --secret $FAUNA_SECRET --no-input else fauna schema abandon --secret $FAUNA_SECRET --no-input fi ``` #### [](#gitlab)GitLab In your project, create a `.gitlab-ci.yml` file with the following contents: ```yaml stages: - stage_schema - commit_or_abandon_schema variables: FAUNA_SECRET: $FAUNA_SECRET default: image: node:20 stage_schema: stage: stage_schema script: - npm install -g fauna-shell - fauna schema push --secret $FAUNA_SECRET --no-input --dir ./path/to/schema/dir - | attempts=0 max_attempts=60 while [ $attempts -lt $max_attempts ]; do STATUS=$(fauna schema status --secret $FAUNA_SECRET | grep -oP '(?<=Staged changes: )\w+') echo "Current status: $STATUS" if [ "$STATUS" = "ready" ]; then echo "Schema is ready" exit 0 elif [ "$STATUS" = "failed" ]; then echo "Schema staging failed" exit 1 fi echo "Waiting for schema to be ready..." sleep 30 attempts=$((attempts + 1)) done echo "Timeout waiting for schema status" exit 1 only: - main commit_or_abandon_schema: stage: commit_or_abandon_schema script: - npm install -g fauna-shell - | STATUS=$(fauna schema status --secret $FAUNA_SECRET | grep -oP '(?<=Staged changes: )\w+') if [ "$STATUS" = "ready" ]; then fauna schema commit --secret $FAUNA_SECRET --no-input else fauna schema abandon --secret $FAUNA_SECRET --no-input exit 1 fi only: - main ``` ## [](#multi-tenancy)Manage schema for child databases Fauna databases support a hierarchical database structure with top-level and child databases. You can manage schema for child databases using: * The [Fauna CLI](#cli) * [FQL schema methods](#fql) * The [Schema HTTP API](#http) With the CLI and FQL schema methods, you can use scoped keys to manage a child database’s schema from a parent database. ### [](#cli)Use scoped keys with the CLI The Fauna CLI’s [`fauna schema`](../../../build/cli/v4/commands/schema/) commands let you specify a `--database` when you use [interactive login](../../../build/cli/v4/#interactive) or an [account key](../../../build/cli/v4/#account-key) for authentication. You can use `--database` to interact with any child database the related account key has access to. ```cli # Stage schema changes for the # 'us/parent_db/child_db' database. fauna schema push \ --database us/parent_db/child_db \ --dir /path/to/schema/dir ``` Alternatively, you can use `--secret` to provide a [scoped key](../../security/keys/#scoped-keys). A scoped key lets you interact with a child database’s schema using a parent database’s admin key. For example, with a parent database’s admin key secret, you can access a child database by appending the child database name and role: ```cli # Use a scoped key from a parent database # to stage schema in the 'child_db' child database. # The scoped key has `admin` privileges. fauna schema push \ --secret fn123:child_db:admin \ --dir /path/to/schema/dir ``` ### [](#fql)Use FQL schema methods Fauna stores each schema for a database as an FQL document in a related [system collection](../../data-model/collections/#system-coll). You can use methods for these system collections to programmatically manage the schema of child databases using FQL queries. Use a [scoped key](../../security/keys/#scoped-keys) to manage a child database’s schema using queries in a parent database.
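For example, the following sketch uses the JavaScript driver with a scoped key to read and update a child database's collection definitions. The scoped key format matches the CLI example above; the `Product` collection and `byName` index are illustrative assumptions.

```javascript
import { Client, fql } from "fauna";

// Scoped key: a parent database's admin key secret with the
// child database name and role appended (illustrative value).
const client = new Client({ secret: "fn123:child_db:admin" });

// List the child database's collection definitions.
const collections = await client.query(fql`Collection.all()`);
console.log(collections.data);

// Add an illustrative index by updating the collection's
// `Collection` document.
await client.query(fql`
  Collection.byName("Product")?.update({
    indexes: {
      byName: {
        terms: [{ field: ".name" }]
      }
    }
  })
`);

client.close();
```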
The following table maps FSL schema to the corresponding FQL system collections: | FSL schema | FQL system collection | | --- | --- | --- | --- | | Access provider schema | AccessProvider collection | | Collection schema | Collection collection | | Function schema | Function collection | | Role schema | Role collection | ### [](#http)Use Schema HTTP API endpoints You can use the Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) to programmatically perform schema changes, including [staged schema changes](#staged). Use a [scoped key](../../security/keys/#scoped-keys) to manage a child database’s schema using a parent database’s key secret. For example, the following [Update schema files](../../../reference/http/reference/core-api/#operation/update) request uses a scoped key that impersonates a key with the `admin` role for the `childDb` child database. The request starts a staged schema change for the `childDb` child database: ```bash curl -X POST "https://db.fauna.com/schema/1/update?staged=true" \ -H "Authorization: Bearer $FAUNA_SECRET:childDb:admin" \ -H "Content-Type: multipart/form-data" \ -F "collections.fsl=@./schema/collections.fsl" \ -F "functions.fsl=@./schema/functions.fsl" ``` # Query data with FQL | Reference: FQL language reference, FQL API reference | | --- | --- | --- | Fauna Query Language (FQL) is a TypeScript-like language used to read and write data in Fauna. You can use FQL to build type-safe, composable queries that combine the flexibility of JSON-like documents with the safety of static typing. ```fql // Calls the `all()` method on the `Product` collection. // Returns the `name`, `description`, and `price` fields of // all `Product` collection documents. Product.all() { name, description, price } ``` Fauna stores data as JSON-like [documents](../data-model/documents/), organized into [collections](../data-model/collections/). Queries can filter and fetch documents from a collection as a [set](../data-model/sets/), and iterate through each document. ## [](#run-queries)Run queries ### [](#http-api-and-client-drivers)HTTP API and client drivers In an application, you typically run queries by working directly with the [Fauna Core HTTP API](../../reference/http/reference/core-api/) or using a Fauna [client driver](../../build/drivers/) for your preferred programming language: ![JavaScript](../../build/_images/drivers/logos/javascript.svg) [JavaScript](../../build/drivers/js-client/) ![Python](../../build/_images/drivers/logos/python.svg) [Python](../../build/drivers/py-client/) ![Go](../../build/_images/drivers/logos/golang.svg) [Go](../../build/drivers/go-client/) ![C#](../../build/_images/drivers/logos/csharp.svg) [.NET/C#](../../build/drivers/dotnet-client/) ![Java](../../build/_images/drivers/logos/java.svg) [Java](../../build/drivers/jvm-client/) ### [](#fauna-dashboard)Fauna Dashboard For testing or one-off queries, you can use the [Fauna Dashboard](https://dashboard.fauna.com/) Shell to run FQL queries: ![Run an FQL query in the Dashboard Shell](../../get-started/_images/dashboard-shell-query.gif) ### [](#fauna-cli)Fauna CLI You can also use the [Fauna CLI](../../build/cli/v4/) to run FQL queries from [files](../../build/cli/v4/commands/query/) or in an [interactive REPL](../../build/cli/v4/commands/shell/).
![Run an FQL query using the Fauna CLI](../_images/fauna-cli-eval.gif) | See Fauna CLI v4 | | --- | --- | --- | ## [](#basic-operations)Basic operations For examples of basic operations using FQL queries, including document CRUD operations, see [CRUD and basic operations](patterns/basic-ops/). For other common FQL query patterns, see [FQL query patterns](patterns/). ## [](#method-chaining)Method chaining You typically compose an FQL query by chaining methods to one or more collections. Expressions are evaluated from left to right. ```fql // `firstWhere()` gets the first `Product` collection document with // a `name` of `cups`. `update()` updates the document's `name` // field value to `clear cups`. The `?.` operator only calls // `update()` if `firstWhere()` returns a non-null value. Product.firstWhere(.name == "cups")?.update({ name: "clear cups" }) ``` | See Field accessors and method chaining | | --- | --- | --- | ## [](#static-typing)Static typing FQL is statically typed. Every expression has a data type that’s checked before evaluation. If Fauna detects a type mismatch, it rejects the query with an error. Type checking helps catch errors early and consistently, saving you time during development. | See Static typing | | --- | --- | --- | ## [](#indexes)Indexes An index stores, or covers, specific document field values for quick retrieval. You can use indexes to filter and sort a collection’s documents in a performant way. Unindexed queries should be avoided. | See Indexes | | --- | --- | --- | ## [](#relationships)Relationships You create relationships by including a reference to a document in another document. This lets you model complex data structures without duplicating data across collections. ```fql // Get a `Category` collection document. let produce = Category.byName("produce").first() // Create a `Product` collection document that references // the `Category` collection document in the `category` field. Product.create({ name: "key lime", description: "Organic, 1 ct", price: 79, category: produce, stock: 2000 }) ``` | See Model relationships using document references | | --- | --- | --- | ## [](#projection)Projection Projection lets you return only the specific fields you want from queries. You can project results to dynamically resolve document relationships. This lets you resolve multiple deeply nested relationships in a single query. ```fql // Uses projection to only return each document's `name`, // `description`, and `category` fields. The `category` field // contains a resolved `Category` collection document. Product.byName("key lime") { name, description, category } ``` | See Projection and field aliasing | | --- | --- | --- | ## [](#pagination)Pagination Fauna provides pagination for queries that return large datasets. Fauna client drivers include methods for iterating through paginated query results. | See Pagination | | --- | --- | --- | ## [](#query-composition)Query composition The [Fauna client drivers](../../build/drivers/) compose queries using template strings. You can interpolate variables, including other FQL template strings, to build dynamic queries.
For example, using the [JavaScript driver](../../build/drivers/js-client/): ```javascript import { Client, fql } from "fauna"; const client = new Client({ secret: 'FAUNA_SECRET' }); const minPrice = 5_00; const maxPrice = 50_00; const query = fql` Product.where(.price >= ${minPrice} && .price <= ${maxPrice}) `; let response = await client.query(query); console.log(response.data); client.close(); ``` | See Query composition | | --- | --- | --- | ## [](#performance-hints)Performance hints [Performance hints](performance-hints/) provide actionable steps for improving an FQL query’s performance. You typically use performance hints when testing or prototyping queries in the [Fauna Dashboard](https://dashboard.fauna.com/) Shell. ![Enable performance hints in the Dashboard](../_images/perf-hint-ex.gif) | See Performance hints | | --- | --- | --- | # Static typing | See Static typing | | --- | --- | --- | FQL is a statically typed query language. Every value has a specific [data type](../../../reference/fql/types/). You can only call FQL methods that are valid for a value’s type. Fauna’s type checking feature helps you catch FQL type mismatches before evaluation. This helps prevent runtime errors that can be hard to diagnose and fix. ## [](#type-checking-by-example)Type checking by example The following examples show how Fauna handles FQL queries with type checking enabled and disabled. Examples use the [Fauna Dashboard](https://dashboard.fauna.com/)'s demo data. 1. In the [Fauna Dashboard](https://dashboard.fauna.com/), ensure type checking is enabled. 2. Run the following FQL query to create a document in the `Product` collection. ```fql let produce = Category.byName("produce").first() // Create a document with a specific `id` Product.create({ id: "392886847463751747", name: "key limes", description: "Organic, 1 ct", price: 79, stock: 2000, category: produce }) ``` The query uses [`collection.create()`](../../../reference/fql-api/collection/instance-create/) to specify a document `id`. This lets you more easily fetch the document in later examples. 3. With type checking enabled, attempt to update the document using the following query. The query: * Uses [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) to fetch the document you created. * Calls [`document.update()`](../../../reference/fql-api/document/update/) on the returned document. ```fql // Update the document Product.byId("392886847463751747") .update({ description: "Organic, 2 ct", }) ``` The query is unsafe and returns an error: ``` invalid_query: The query failed 1 validation check error: Type `Null` does not have field `update` at *query*:3:4 | 3 | .update({ | ^^^^^^ | hint: Use the ! or ?. operator to handle the null case at *query*:2:35 | 2 | Product.byId("392886847463751747")! | + | ``` [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) returns a [Document](../../../reference/fql/types/#document) or a [NullDoc](../../../reference/fql/types/#nulldoc), which indicates the requested document doesn’t exist. [NullDoc](../../../reference/fql/types/#nulldoc)s always contain a `null` value. The query passes this value to [`document.update()`](../../../reference/fql-api/document/update/), which only accepts a [Document](../../../reference/fql/types/#document). The query doesn’t safely handle cases in which [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) may return a [NullDoc](../../../reference/fql/types/#nulldoc).
To fix this, you can append one of the following postfix operators to [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/): * `?.` ([optional chaining](../../../reference/fql/operators/#optional-chaining)), which only calls [`document.update()`](../../../reference/fql-api/document/update/) if [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) returns a value other than `null`. * `!` ([non-null assertion](../../../reference/fql/operators/#non-null)), which asserts that [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) will not return a `null` value. 4. Add the `?.` postfix operator and rerun the query: ```fql // Update the document Product.byId("392886847463751747") ?.update({ description: "Organic, 2 ct", }) ``` The query now runs successfully. 5. In the Dashboard Shell, click **Typecheck** to disable type checking. 6. Rerun the previous unsafe query: ```fql // Update the document Product.byId("392886847463751747") .update({ description: "Organic, 2 ct", }) ``` With type checking disabled, the unsafe query runs successfully. While this may seem more convenient, it adds risk. Now, the query won’t return an error until [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) returns a [NullDoc](../../../reference/fql/types/#nulldoc). If this does happen, it’s likely to occur later in the development process — possibly when the application is already in production. 7. To test this out, delete the document and rerun the unsafe query with type checking disabled: ```fql // Delete the document Product.byId("392886847463751747") .delete() // Attempt to update the deleted document Product.byId("392886847463751747") .update({ description: "Organic, 2 ct", }) ``` The query now returns an error: ``` document_not_found: Collection `Product` does not contain document with id 392886847463751747. error: Collection `Product` does not contain document with id 392886847463751747. at *query*:6:13 | 6 | Product.byId("392886847463751747") | ^^^^^^^^^^^^^^^^^^^^^^ | ``` ## [](#enable-database)Enable type checking for a database When you create a database in the Fauna Dashboard, you can enable type checking using the **Static Typing** option. If needed, you can later edit the database to change this option. ### [](#enable-type-checking-for-a-child-database)Enable type checking for a child database You can enable type checking for a child database using [`.update()`](../../../reference/fql-api/database/update/): ```fql Database.byName( "childDB" ) ?.update( { typechecked: true } ) ``` The `typecheck` property set using a [Fauna client driver](#enable-driver) overrides this setting. ### [](#enable-driver)Enable type checking in the driver You can enable type checking in [Fauna’s client drivers](../../../build/drivers/) at two levels: * Configure the driver’s client instance to enable type checking on all queries by default. For example: ```javascript ... const clientConfig = { typecheck: true, ... }; const client = new Client(clientConfig); ... ``` * Configure type checking on specific query calls. For example: ```javascript ... const queryOptions = { typecheck: true, ... }; const response = await client.query(fql`"Hello World!"`, queryOptions); ... ``` To use type checking in a driver, type checking must be [enabled in the database](#enable-database).
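Putting this together, here's a minimal sketch using the JavaScript driver that enables type checking for a single query and handles the resulting check error. It assumes `FAUNA_SECRET` holds a secret for a database with type checking enabled; the driver is expected to surface the `invalid_query` error shown earlier as a `QueryCheckError`.

```javascript
import { Client, fql, QueryCheckError } from "fauna";

// Assumes FAUNA_SECRET holds a secret for a database
// with type checking enabled.
const client = new Client({ secret: process.env.FAUNA_SECRET });

try {
  // Enable type checking for this query only. The query is
  // unsafe: `byId()` may return a NullDoc, which doesn't
  // have an `update()` method.
  await client.query(
    fql`Product.byId("392886847463751747").update({
      description: "Organic, 2 ct",
    })`,
    { typecheck: true }
  );
} catch (error) {
  if (error instanceof QueryCheckError) {
    // The query was rejected before evaluation.
    console.error("Type check failed:", error.message);
  } else {
    throw error;
  }
}

client.close();
```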
## [](#type-checking-for-udfs-and-other-database-entities)Type checking for UDFs and other database entities If you disable type checking for a database, Fauna does not type check the database’s: * User-defined function (UDF) definitions * FQL expressions, such as predicate functions, in role and access provider definitions # Pagination Pagination lets you iterate through large [Sets](../../data-model/sets/) returned by a query. This guide covers default pagination, customizing page size, and accessing paginated results within FQL queries. ## [](#cursor)Default pagination Fauna automatically paginates result Sets with 16 or more elements. When a query returns paginated results, Fauna materializes a subset of the Set with an `after` pagination cursor: ```fql // Uses the `Product` collection's `sortedByPriceLowToHigh()` index to // return all `Product` collection documents. // The collection contains more than 16 documents. Product.sortedByPriceLowToHigh() ``` ``` { // The result Set contains 16 elements. data: [ { id: "555", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "single lime", description: "Conventional, 1 ct", price: 35, stock: 1000, category: Category("789") }, { id: "888", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "cilantro", description: "Organic, 1 bunch", price: 149, stock: 100, category: Category("789") }, ... ], // Use the `after` cursor to get the next page of results. after: "hdW..." } ``` ## [](#customize-page-size)Customize page size | Reference: set.pageSize() | | --- | --- | --- | Use [`set.pageSize()`](../../../reference/fql-api/set/pagesize/) to change the maximum number of elements per page: ```fql // Calls `pageSize()` with a size of `2`. Product.sortedByPriceLowToHigh().pageSize(2) ``` ``` { // The result Set contains two elements or fewer. data: [ { id: "555", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "single lime", description: "Conventional, 1 ct", price: 35, stock: 1000, category: Category("789") }, { id: "888", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "cilantro", description: "Organic, 1 bunch", price: 149, stock: 100, category: Category("789") } ], after: "hdaExad..." } ``` You should typically place `pageSize()` last in a method chain. `pageSize()` only affects the rendering of a Set, not subsequent operations. Methods chained to `pageSize()` access the entire calling Set, not a page of results. ## [](#iterate-through-pages)Iterate through pages | Reference: Set.paginate() | | --- | --- | --- | To iterate through paginated results, pass the `after` cursor to [`Set.paginate()`](../../../reference/fql-api/set/static-paginate/): ```fql Set.paginate("hdW...") ``` ### [](#example-implementation)Example implementation The following example shows how you can iterate through paginated results using the JavaScript driver: ```javascript import { Client, fql } from "fauna"; const client = new Client({ secret: '' }); // Defines a function that accepts a Fauna `after` cursor as // an optional argument. async function getProducts(afterCursor) { // Initial FQL query. const query = fql`Product.sortedByPriceLowToHigh().pageSize(2)`; const response = await client.query( // If an `after` cursor is provided, use `Set.paginate()` // to get the next page of results. // Otherwise, use the initial FQL query. afterCursor ? fql`Set.paginate(${afterCursor})` : query ); const data = response.data.data; const nextCursor = response.data.after; // Print the results and the after cursor. 
console.log("Data:", data); console.log("Next cursor:", nextCursor); return { data, nextCursor }; } // Defines a function to loop through paginated results. async function getAllProducts() { let afterCursor; do { const { data, nextCursor } = await getProducts(afterCursor); afterCursor = nextCursor; } while (afterCursor); } // Call the function to loop through the results. getAllProducts(); ``` ### [](#driver)Driver pagination methods The Fauna client drivers also include methods for automatically iterating through pages. See: * [JavaScript driver docs](../../../build/drivers/js-client/#pagination) * [Python driver docs](../../../build/drivers/py-client/#pagination) * [Go driver docs](../../../build/drivers/go-client/#pagination) * [.NET/C# driver docs](../../../build/drivers/dotnet-client/#pagination) * [JVM driver docs](../../../build/drivers/jvm-client/#pagination) ## [](#access-pages-and-cursors-within-a-query)Access pages and cursors within a query | Reference: set.paginate() | | --- | --- | --- | If you need to access an `after` cursor or paginated results within an FQL query, use [`set.paginate()`](../../../reference/fql-api/set/paginate/): ```fql Product.sortedByPriceLowToHigh().pageSize(2).paginate() ``` For example, you can use `paginate()` to return the `after` cursor for use as a URL in a client application. Alternatively, you can use `paginate()` to iteratively update a large Set of collection documents over several queries. For an example, see the [`paginate()` reference docs](../../../reference/fql-api/set/paginate/#examples). ### [](#considerations-for-paginate)Considerations for `paginate()` `paginate()` accepts an optional argument to control page size. In most cases, you should not use `paginate()` in place of `pageSize()`. The following table outlines differences between [`set.pageSize()`](../../../reference/fql-api/set/pagesize/) and [`set.paginate()`](../../../reference/fql-api/set/paginate/): | Difference | set.pageSize() | set.paginate() | | --- | --- | --- | --- | --- | | Use case | Use in most cases. | Use when needing to access an 'after' cursor or paginated results within an FQL query. | | Return type | Returns a set. | Returns an object. | | Loading strategy | Lazy loading. Only fetches results as needed. | Eager loading. Fetches results instantly, even if the results aren’t returned or used. | | Client driver methods | Compatible with driver pagination methods. | Incompatible with driver pagination methods. | | Projection | Supports projections. | Doesn’t support projections. | | Set instance methods | Supports set instance methods. | Doesn’t support set instance methods. | ## [](#cursor-state-expiration)Cursor state and expiration If a paginated Set contains documents, the `after` cursor fetches [historical snapshots](../../doc-history/) of the documents at the time of the original query. You can control the retention of document snapshots using the collection schema’s [`history_days` setting](../../doc-history/#history-retention). An `after` cursor is valid for `history_days` plus 15 minutes. If `history_days` is `0` or unset, the cursor is valid for 15 minutes. If a document’s snapshot is no longer available, a [NullDoc](../../../reference/fql/types/#nulldoc) is returned instead: ``` { data: [ { id: "393605620096303168", coll: Product, ts: Time("2099-03-28T12:53:40.750Z"), name: "limes", ... }, Product("401942927818883138") /* not found */ ], after: "hdWCxmd..." 
} ``` | See Document history | | --- | --- | --- | ### [](#invalid-cursor)Invalid cursor If you pass an invalid or expired `after` cursor to [`Set.paginate()`](../../../reference/fql-api/set/static-paginate/), the Fauna Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query) returns a query runtime error with an `invalid_cursor` [error code](../../../reference/http/reference/errors/) and a 400 HTTP status code: ``` { "error": { "code": "invalid_cursor", "message": "Cursor cursor is invalid or expired." }, ... } ``` Fauna’s client drivers include classes for query runtime errors: * JavaScript driver: [`QueryRuntimeError`](https://fauna.github.io/fauna-js/latest/classes/QueryRuntimeError.html) * Python driver: [`QueryRuntimeError`](https://fauna.github.io/fauna-python/latest/api/fauna/errors/errors.html#QueryRuntimeError) * Go driver: [`ErrQueryRuntime ¶`](https://pkg.go.dev/github.com/fauna/fauna-go/v2#ErrQueryRuntime) * .NET/C# driver: [`QueryRuntimeException`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_exceptions_1_1_query_runtime_exception.html) * JVM driver: [`QueryRuntimeException`](https://fauna.github.io/fauna-jvm/latest/com/fauna/exception/QueryRuntimeException.html) ### [](#cursor-handling-and-best-practices)Cursor handling and best practices When working with `after` cursors, keep the following in mind: * **Encoding:** `after` cursors can contain special characters, such as `+`. If you’re using an after cursor in a URL or as a query parameter, make sure to URL-encode the cursor value. This ensures that the cursor is properly transmitted and interpreted. Generally, we don’t recommend you use `after` cursors for user-facing pagination. See [UI pagination](#ui). * **Cursor integrity:** You shouldn’t directly change or manipulate `after` cursor values. The cursor is a string-encoded hash that contains all information required to get the next page in the Set. Any modification to the cursor could result in unexpected behavior or errors when retrieving the next page. ## [](#reverse)Paginate in reverse Paginated queries don’t include a `before` cursor. Instead, you can use a range search and document IDs or other unique field values to paginate in reverse. For example: 1. Run an initial paginated query: ```fql Product.all().pageSize(2) ``` ``` { data: [ { id: "111", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, { id: "222", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "donkey pinata", description: "Original Classic Donkey Pinata", price: 2499, stock: 50, category: Category("123") } ], after: "hdW..." } ``` 2. Page forward until you find the document you want to start reversing from: ```fql Set.paginate("hdW...") ``` Copy the ID of the document: ``` { data: [ { id: "333", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") }, { // Begin reverse pagination from this doc ID. id: "444", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ], after: "hdW..." } ``` 3. To reverse paginate, run the original query with: * A range search with a `to` argument containing the previous document ID. * [`set.reverse()`](../../../reference/fql-api/set/reverse/): Append this to the query. 
* [`set.pageSize()`](../../../reference/fql-api/set/pagesize/): If used, place it after [`set.reverse()`](../../../reference/fql-api/set/reverse/). ```fql // "444" is the ID of the document to reverse from. Product.all({ to: "444" }).reverse().pageSize(2) ``` ``` { data: [ { // The results of the previous query are reversed. id: "444", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") }, { id: "333", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") } ], after: "hdW..." } ``` To get historical snapshots of documents at the time of the original query, use an [`at` expression](../../../reference/fql/statements/#at): ```fql // Time of the original query. let originalQueryTime = Time.fromString("2099-08-16T14:30:00.000Z") at (originalQueryTime) { // "444" is the ID of the document to reverse from. Product.all({ to: "444" }).reverse().pageSize(2) } ``` 4. Repeat the previous step to continue paginating in reverse: ```fql Product.all({ to: "333" }).reverse().pageSize(2) ``` ## [](#ui)UI pagination Fauna’s pagination functionality is designed to help client apps consume large result Sets at a controlled pace. It isn’t intended for use in user-facing pagination. To implement UI pagination in your app, we recommend you use [range searches](../../data-model/indexes/#range-search) instead. Range searches give you precise control over returned data and consistent results. ### [](#build-ui-pagination-with-ranged-searches)Build UI pagination with ranged searches For example, an e-commerce application must present users with a paginated list of products. Products are sorted by ascending price with up to five products per page. To implement UI pagination using ranged searches: 1. Define an index in the `Product` collection schema: ```fsl collection Product { ... index inOrderOfPrice { // `values` are document fields for sorting and range searches. // In this example, you sort and filter index results by their // ascending `price` field value. Include the document `id` // to handle cases where multiple products have the same price. values [ .price, .id ] } } ``` Submit the updated schema to Fauna using the [Fauna Dashboard](https://dashboard.fauna.com/) or the [Fauna CLI](../../../build/cli/v4/)'s [`fauna schema push`](../../../build/cli/v4/commands/schema/push/) command. 2. To get the first page of results, the app runs the following FQL query: ```fql // Gets the first page of products, // starting with a price of `0`. Product.inOrderOfPrice({ from: [0] })! .take(5 + 1) // PAGE_SIZE + 1 .toArray() { id, name, price } ``` The result array includes six items, but the app UI only displays the first five. The presence of a sixth item indicates a "next" page is available. ``` [ { id: "555", name: "single lime", price: 35 }, { id: "888", name: "cilantro", price: 149 }, { id: "777", name: "limes", price: 299 }, { id: "666", name: "organic limes", price: 349 }, { id: "444", name: "avocados", price: 399 }, { id: "333", name: "pizza", price: 499 } ] ``` 3. To page forward: * Use the `price` and `id` of the last item from the previous results. * Take `PAGE_SIZE` + 1 items. ```fql // Gets the next page of products, starting with // the `price` (499) and document `id` of last item // from the previous results. Product.inOrderOfPrice({ from: [499, "333"] })! 
.take(5 + 1) // PAGE_SIZE + 1 .toArray() { id, name, price } ``` Like the previous query, the app UI only displays the first five items in the results. The presence of a sixth item indicates a "next" page is available. ``` [ { id: "333", name: "pizza", price: 499 }, { id: "111", name: "cups", price: 698 }, { id: "999", name: "taco pinata", price: 2399 }, { id: "222", name: "donkey pinata", price: 2499 }, { id: "123", name: "gorilla pinata", price: 2599 }, { id: "456", name: "giraffe pinata", price: 2799 } ] ``` 4. To page backward: * Pass the `price` and `id` of the first item from the previous results to `to` in the ranged search. * Call [`set.reverse()`](../../../reference/fql-api/set/reverse/) on the Set. * Take `PAGE_SIZE` + 2 items. * Drop the first item, which is a duplicate from the last page. Range searches are inclusive. ```fql // Gets the previous page of products, starting with // the `price` (499) and document `id` of the first // item from the previous results. Product.inOrderOfPrice({ to: [499, "333"] })! .reverse() .take(5 + 2) // PAGE_SIZE + 2 .drop(1) .toArray() { id, name, price } ``` Similar to the previous queries, the app UI only displays the first five items in the results. The presence of a sixth item indicates a "previous" page is available. In this example, no "previous" page is available. ``` [ { id: "444", name: "avocados", price: 399 }, { id: "666", name: "organic limes", price: 349 }, { id: "777", name: "limes", price: 299 }, { id: "888", name: "cilantro", price: 149 }, { id: "555", name: "single lime", price: 35 } ] ``` ### [](#jump-to-a-page-using-an-offset)Jump to a page using an offset To jump to a specific page, you can specify an offset using [`set.take()`](../../../reference/fql-api/set/take/) and [`set.drop()`](../../../reference/fql-api/set/drop/): ```fql // Jumps to page 3 (PAGE_NUM = 3) with each // page containing 5 items (PAGE_SIZE = 5). Product.inOrderOfPrice()! .take((5 * 3) + 1) // (PAGE_SIZE * PAGE_NUM) + 1 .drop(5 * (3 - 1)) // PAGE_SIZE * (PAGE_NUM - 1) .toArray() { id, name, price } ``` ``` [ { id: "456", name: "giraffe pinata", price: 2799 }, { id: "789", name: "whale pinata", price: 3299 }, { id: "234", name: "tablet", price: 3599 }, { id: "567", name: "drone", price: 3799 }, { id: "890", name: "smartphone", price: 15999 }, { id: "345", name: "laptop", price: 24999 } ] ``` ### [](#consistent-results-across-pages)Consistent results across pages By default, queries run against the most recent version of documents. If the documents change during pagination — for example, if you create or delete documents — results may not be consistent across queries. To ensure consistent results, use [`at` expression](../../../reference/fql/statements/#at) to get historical snapshots of documents at the time of the original query: ```fql // Time of the original query. let originalQueryTime = Time.fromString("2099-08-16T14:30:00.000Z") at (originalQueryTime) { Product.inOrderOfPrice({ to: [499, "333"] })! .reverse() .take(5 + 2) // PAGE_SIZE + 2 .drop(1) .toArray() { id, name, price } } ``` # Query composition The [Fauna client drivers](../../../build/drivers/) compose queries using FQL template strings. You can interpolate variables, including other FQL template strings, into the template strings to compose dynamic queries. For example, using the [JavaScript driver](../../../build/drivers/js-client/): ```javascript import { Client, fql, FaunaError } from "fauna"; const client = new Client({ secret: 'FAUNA_SECRET' }); // Create a native JS var. 
const name = "avocados"; // Pass the var to an FQL query. let query = fql`Product.where(.name == ${name})`; let response = await client.query( query ); console.log(response.data); client.close(); ``` To prevent injection attacks, the drivers use the [wire protocol](../../../reference/http/reference/wire-protocol/) encode interpolated variables to an appropriate [FQL type](../../../reference/fql/types/) before passing the query to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). ## [](#example-product-catalog-search)Example: Product catalog search An e-commerce app has a product catalog that users search by various criteria, such as product category, price range, or other properties. A user may search by product type, minimum price, maximum price, or any combination of these criteria. The following dynamic query handles various combinations of these search parameters: ```javascript import { Client, fql, FaunaError } from "fauna"; const client = new Client({ secret: 'FAUNA_SECRET' }); // Set up parameterized inclusive, exclusive, // or exact matches. const operators = { eq: (field, value) => fql`x => x[${field}] == ${value}`, gt: (field, value) => fql`x => x[${field}] > ${value}`, gte: (field, value) => fql`x => x[${field}] >= ${value}`, lt: (field, value) => fql`x => x[${field}] < ${value}`, lte: (field, value) => fql`x => x[${field}] <= ${value}`, } // Define the product search parameters set in the UI. const predicates = [ { field: "type", operator: "eq", value: "food" }, { field: "price", operator: "lte", value: 10000 }, { field: "price", operator: "gte", value: 10 }, ]; const base_query = fql`Product` // Chain successive `where()` methods to the query, // effectively ANDing the predicates. const dynamicQuery = predicates.reduce( (acc, val) => fql`${acc}.where( ${operators[val.operator] ( val.field, val.value)})`, base_query ) let response = await client.query(dynamicQuery) console.log(response.data) client.close(); ``` For simplicity, the example uses [`collection.where()`](../../../reference/fql-api/collection/instance-where/), which requires a read of each document and isn’t performant on large datasets. For better performance, use [indexes](../../data-model/indexes/) instead. ## [](#dynamic-filtering-using-advanced-query-composition)Dynamic filtering using advanced query composition Complex applications may need to handle arbitrary combinations of search criteria. In these cases, you can use [query composition](./) to dynamically apply [indexes](../../data-model/indexes/) and [filters](../patterns/sets/#filters) to queries. The following template uses query composition to: * Automatically select the most selective index * Apply remaining criteria as filters in priority order * Support both index-based and filter-based search patterns The template uses TypeScript and the [JavaScript driver](../../../build/drivers/js-client/). A similar approach can be used with any [Fauna client driver](../../../build/drivers/). ```typescript /** * A Javascript object with a sorted list of indexes or filters. * * Javascript maintains key order for objects. * Sort items in the map from most to least selective. */ type QueryMap = Record Query> /** Object to represent a search argument. * * Contains the name of the index to use and the arguments * to pass to it. 
* * Example: * { name: "by_name", args: ["limes"] } * { name: "range_price", args: [{ from: 100, to: 500 }] } */ type SearchTerm = { name: string args: any[] } /** * Composes a query by prioritizing the most selective index and then * applying filters. * * @param default_query - The initial query to which indexes and filters are applied. * @param index_map - A map of index names to functions that generate query components. * @param filter_map - A map of filter names to functions that generate query components. * @param search_terms - An array of search terms that specify the type and arguments * for composing the query. * @returns The composed query after applying all relevant indices and filters. */ const build_search = ( default_query: Query, index_map: QueryMap, filter_map: QueryMap, search_terms: SearchTerm[] ): Query => { const _search_terms = [...search_terms] // Initialize a default query. Used if no other indexes are applicable. let query: Query = default_query // Iterate through the index map, from most to least selective. build_index_query: for (const index_name of Object.keys( index_map )) { // Iterate through each search term to check if it matches the highest priority index. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the // list and break out of the loop. if (index_name === search_term.name) { query = index_map[search_term.name](...search_term.args) _search_terms.splice(_search_terms.indexOf(search_term), 1) break build_index_query } } } // Iterate through the filter map, from most to least selective. for (const filter_name of Object.keys(filter_map)) { // Iterate through each search term to check if it matches the highest priority filter. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the list. if (filter_name === search_term.name) { const filter = filter_map[search_term.name](...search_term.args) query = fql`${query}${filter}` _search_terms.splice(_search_terms.indexOf(search_term), 1) } } } // If there are remaining search terms, you can't build the full query. if (_search_terms.length > 0) { throw new Error("Unable to build query") } return query } ``` The following example implements the template using the [Fauna Dashboard](https://dashboard.fauna.com/)'s demo data: ```typescript // Implementation of `index_map` from the template. // Sort items in the map from most to least selective. const product_index_priority_map: QueryMap = { by_order: (id: string) => fql`Order.byId(${id})!.items.map(.product!)`, by_name: (name: string) => fql`Product.byName(${name})`, by_category: (category: string) => fql`Product.byCategory(Category.byName(${category}).first()!)`, range_price: (range: { from?: number; to?: number }) => fql`Product.sortedByPriceLowToHigh(${range})`, } // Implementation of `filter_map` from the template. // Sort items in the map from most to least selective. const product_filter_map: QueryMap = { by_name: (name: string) => fql`.where(.name == ${name})`, by_category: (category: string) => fql`.where(.category == Category.byName(${category}).first()!)`, range_price: ({ from, to }: { from?: number; to?: number }) => { // Dynamically filter products by price range. 
if (from && to) { return fql`.where(.price >= ${from} && .price <= ${to})` } else if (from) { return fql`.where(.price >= ${from})` } else if (to) { return fql`.where(.price <= ${to})` } return fql`` }, } // Hybrid implementation of `index_map` and `filter_map` from the template. // Combines filters and indexes to compose FQL query fragments. // Sort items in the map from most to least selective. const product_filter_with_indexes_map: QueryMap = { by_name: (name: string) => fql`.where(doc => Product.byName(${name}).includes(doc))`, by_category: (category: string) => fql`.where(doc => Product.byCategory(Category.byName(${category}).first()!).includes(doc))`, range_price: (range: { from?: number; to?: number }) => fql`.where(doc => Product.sortedByPriceLowToHigh(${range}).includes(doc))`, } const order_id = (await client.query(fql`Order.all().first()!`)) .data.id const query = build_search( fql`Product.all()`, product_index_priority_map, product_filter_with_indexes_map, [ // { type: "by", name: "name", args: ["limes"] }, // { type: "by", name: "category", args: ["produce"] }, { type: "range", name: "price", args: [{ to: 1000 }] }, { type: "by", name: "order", args: [order_id] }, ] ) const res = await client.query(query) ``` ## [](#driver-examples)Driver examples For additional query composition examples, see the driver documentation: ![JavaScript](../../../build/_images/drivers/logos/javascript.svg) [JavaScript](../../../build/drivers/js-client/#var) ![Python](../../../build/_images/drivers/logos/python.svg) [Python](../../../build/drivers/py-client/#var) ![Go](../../../build/_images/drivers/logos/golang.svg) [Go](../../../build/drivers/go-client/#var) ![C#](../../../build/_images/drivers/logos/csharp.svg) [.NET/C#](../../../build/drivers/dotnet-client/#var) ![Java](../../../build/_images/drivers/logos/java.svg) [Java](../../../build/drivers/jvm-client/#var) # Performance hints Performance hints provide actionable steps for improving an FQL query’s performance. You typically use performance hints when testing or prototyping queries in the [Fauna Dashboard](https://dashboard.fauna.com/) Shell or [Fauna CLI](../../../build/cli/v4/). ## [](#enable-performance-hints)Enable performance hints Performance hints are disabled by default. To enable hints in the Dashboard Shell, toggle **Performance Hints** . ![Enable performance hints in the Dashboard](../../_images/perf-hints-enable.gif) To enable performance hints in the CLI, use the [`fauna query`](../../../build/cli/v4/commands/query/) command’s `--performance-hints` flag: ```cli fauna query "Collection.all().distinct()" \ --database us/my_db \ --performance-hints ``` To enable performance hints for the CLI’s REPL, use the `.togglePerformanceHints` REPL command. You can access the REPL using [`fauna shell`](../../../build/cli/v4/commands/shell/). ## [](#example)Example As an example, Fauna emits performance hints for [uncovered index queries](../../data-model/indexes/#covered-queries). The following FQL query uses an index but projects fields that aren’t covered by the index definition: ```fql // This is an uncovered query. // `stock` is not one of the terms or values // in the `sortedByPriceLowToHigh()` index definition. Product.sortedByPriceLowToHigh() { name, stock } ``` In the Dashboard Shell, the result includes a performance hint, if enabled: ``` performance_hint: non_covered_document_read - .stock is not covered by the Product.sortedByPriceLowToHigh index. See https://docs.faunadb.org/performance_hint/non_covered_document_read. 
at *query*:6:3 | 6 | stock | ^^^^^ | { data: [ ... ] } ``` ## [](#perf-codes)Performance hint codes Each performance hint includes a leading code that indicates its type. The following table provides a description of each hint type, along with examples and resolutions to address the hint. | Hint code | Description | Examples and resolutions | | --- | --- | --- | --- | --- | | collection_scan | The query calls collection.firstWhere() or collection.where() on a collection. This can cause a read of every document in the collection. To address the hint, use an index with terms to look up matching documents instead. | collection.firstWhere(), collection.where() | | non_covered_document_read | Returned for uncovered index queries. The query uses an index but returns entire documents or field values not covered by the index. Consider adding the uncovered fields to the index definition’s values. Then use projection or mapping to return only the fields you need. Fauna does not return the non_covered_document_read performance hint for uncovered index queries that pass an mva() term. | Uncovered index queries; index queries with no projection or mapping | | full_set_read | The query calls an eager-loading method on a document Set. To address the hint, do one of the following: use set.take() to explicitly limit the size of the calling Set to fewer than 100 documents. This applies even if the original, unbounded Set contains fewer than 100 documents. Rewrite the query to avoid calling the method. Alternatively, the query calls set.order() on a document Set, causing all documents in the Set to be read. If you frequently run such queries, consider adding the fields used for ordering to an index definition’s values. | set.aggregate(), set.count(), set.distinct(), set.every(), set.fold(), set.foldRight(), set.forEach(), set.includes(), set.order(), set.reduce(), set.reduceRight(), set.toArray() | ## [](#x-performance-hints-header)`X-Performance-Hints` header Internally, performance hints are enabled and disabled using the [`X-Performance-Hints` request header](../../../reference/http/reference/query-summary/#perf) for the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). Performance hints are returned in the response’s `summary`. For more information, see [Query summary](../../../reference/http/reference/query-summary/#perf). ## [](#performance-hints-in-client-drivers)Performance hints in client drivers To reduce resource consumption, Fauna’s client drivers disable performance hints by default. The Fauna JavaScript driver includes a configuration option to enable performance hints. See the [JavaScript driver API reference](https://fauna.github.io/fauna-js/latest/interfaces/ClientConfiguration.html#performance_hints). # FQL query patterns This section includes common FQL query patterns. ## [](#in-this-section)In this section [CRUD and basic operations](basic-ops/) Examples of common database operations, including document CRUD operations. [Bulk writes](bulk-writes/) Create, update, and delete collection documents in bulk using FQL queries. [Check a secret’s user-defined roles](check-secret-roles/) Use user-defined functions (UDFs) to get the user-defined roles assigned to an authentication secret. [Conditional operations](conditional/) Use [`if … else`](../../../reference/fql/statements/#if) statements to perform conditional operations in Fauna.
[Work with dates and times](date-time/) Use FQL’s built-in methods to convert, search, and transform [Date](../../../reference/fql-api/date/) and [Time](../../../reference/fql-api/time/) values. [Geospatial search](geospatial-search/) Use Fauna to run searches in a provided geographic area using a bounding box. [Get unique field values](get-unique-values/) Learn how to get unique field values from a Set of documents. [Group By: Aggregate data in Fauna](group-by/) Use FQL’s built-in `fold()` method to aggregate data in a way similar to SQL’s `GROUP BY` operation. [String search](string-search/) Use string searches to get documents based on a matching string or substring. # CRUD and basic operations This page contains examples of CRUD queries and other common database operations in Fauna Query Language (FQL) and Fauna Schema Language (FSL). ## [](#collections)Collections ### [](#create-a-collection)Create a collection To create a collection, add an FSL [collection schema](../../../../reference/fsl/collection/) to Fauna using a [staged schema change](../../../schema/manage-schema/#staged). ```fsl // Defines the `Customer` collection. collection Customer { // Field definitions. // Define the structure of the collection's documents. name: String email: String address: { street: String city: String state: String postalCode: String country: String } // Wildcard constraint. // Allows arbitrary ad hoc fields of any type. *: Any // If a collection schema has no field definitions // and no wildcard constraint, it has an implicit // wildcard constraint of `*: Any`. } ``` | See Schema | | --- | --- | --- | ### [](#edit-a-collection)Edit a collection Update a collection’s document type using a [zero-downtime schema migration](../../../schema/#schema-migrations): ```fsl collection Customer { name: String email: String address: { street: String city: String state: String postalCode: String country: String } // Adds the `points` field. Accepts `Int` or `null` values. // Accepting `null` means the field is not required. points: Int? // Adds the `typeConflicts` field as a catch-all field for // existing `points` values that aren't `Int` or `null`. typeConflicts: { *: Any }? *: Any migrations { // Adds the `typeConflicts` field. add .typeConflicts // Adds the `points` field. add .points // Nests non-conforming `points` and `typeConflicts` // field values in the `typeConflicts` catch-all field. move_conflicts .typeConflicts } } ``` | See Schema migrations | | --- | --- | --- | ### [](#view-collections)View collections ```fql Collection.all() ``` | Reference: Collection.all() | | --- | --- | --- | ### [](#delete-a-collection)Delete a collection To delete a collection, delete its schema using any of the following: * The [Fauna CLI](../../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../../reference/http/reference/core-api/#tag/Schema) * The FQL [`collectionDef.delete()`](../../../../reference/fql-api/collection/delete/) method. Deleting a collection deletes its documents and indexes. ## [](#documents)Documents ### [](#create-a-document)Create a document ```fql // Creates a `Customer` collection document.
Customer.create({ name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } }) ``` | Reference: collection.create() | | --- | --- | --- | ### [](#get-a-single-document)Get a single document ```fql // Gets a `Customer` collection document. // Replace `111` with a document `id`. Customer.byId("111") ``` | Reference: collection.byId() | | --- | --- | --- | ### [](#update-a-document)Update a document ```fql // Updates a `Customer` collection document. Customer.byId("111")?.update({ // Updates the existing `name` field value. name: "Jonathan Doe", // Adds new `points` field. points: 42 }) ``` | Reference: document.update() | | --- | --- | --- | ### [](#upsert-a-document)Upsert a document ```fql // Try to find an existing customer. // If the customer doesn't exist, returns `null`. let customer = Customer.byId("111") // Customer data to upsert let data = { name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } // Create or update the customer // based on existence. if (customer == null) { Customer.create(data) } else { customer!.update(data) } ``` ### [](#remove-a-document-field)Remove a document field ```fql // Updates a `Customer` collection document. Customer.byId("111")?.update({ // Removes the `points` field. points: null }) ``` | Reference: document.update() | | --- | --- | --- | ### [](#replace-a-document)Replace a document ```fql // Replaces a `Customer` collection document. Customer.byId("111")?.replace({ name: "Jane Doe", email: "jane.doe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) ``` | Reference: document.replace() | | --- | --- | --- | ### [](#delete-a-document)Delete a document ```fql // Deletes a `Customer` collection document. Customer.byId("111")?.delete() ``` | Reference: document.delete() | | --- | --- | --- | ### [](#bulk-writes)Bulk writes Use [`set.forEach()`](../../../../reference/fql-api/set/foreach/) to iteratively update each document in a Set: ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. let customers = Customer.where(.address?.state == "DC") // Use `forEach()` to update each document in the previous Set. customers.forEach(doc => doc.update({ address: { street: doc?.address?.street, city: doc?.address?.city, state: "District of Columbia", postalCode: doc?.address?.postalCode, country: doc?.address?.country, } })) // `forEach()` returns `null`. ``` For more examples, see [Bulk writes](../bulk-writes/). | Reference: set.forEach() | | --- | --- | --- | ## [](#indexes-and-reads)Indexes and reads ### [](#create-an-index)Create an index You define indexes in FSL as part of a [collection schema](../../../../reference/fsl/collection/): ```fsl collection Customer { ... index byEmail { // `terms` are document fields for exact match searches. // In this example, you get `Customer` collection documents // by their `email` field value. terms [.email] // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // descending `name` and `email` field values. values [.name, .email] } } ``` | See Define an index | | --- | --- | --- | ### [](#exact-match-search)Exact match search ```fql // Runs an unindexed query. 
Customer.where(.email == "alice.appleseed@example.com") ``` For better performance on large datasets, use an index with a term to run an [exact match search](../../../data-model/indexes/#terms). Define the index in the collection schema: ```fsl collection Customer { ... // Defines the `byEmail()` index for the `Customer` // collection. index byEmail { // Includes the `email` field as an index term. terms [.email] values [.name, .email] } } ``` You call an index as a method on its collection: ```fql // Uses the `Customer` collection's `byEmail()` index // to run an exact match search on an `email` field value. Customer.byEmail("alice.appleseed@example.com") ``` | See Run an exact match search | | --- | --- | --- | ### [](#sort-collection-documents)Sort collection documents ```fql // Runs an unindexed query. // Sorts `Product` collection documents by: // - `price` (ascending), then ... // - `name` (ascending), then ... // - `description` (ascending), then ... // - `stock` (ascending). Product.all().order(.price, .name, .description, .stock) ``` For better performance on large datasets, use an index with values to [sort collection documents](../../../data-model/indexes/#sort-documents). Define the index in the collection schema: ```fsl collection Product { ... // Defines the `sortedByPriceLowToHigh()` index. index sortedByPriceLowToHigh { // `values` are document fields for sorting and range searches. values [.price, .name, .description, .stock] } } ``` Call the index in a query: ```fql // Uses the `Product` collection's `sortedByPriceLowToHigh()` index // to sort `Product` collection documents by: // - `price` (ascending), then ... // - `name` (ascending), then ... // - `description` (ascending), then ... // - `stock` (ascending). Product.sortedByPriceLowToHigh() ``` | See Sort collection documents | | --- | --- | --- | ### [](#range-search)Range search ```fql // Runs an unindexed query. // Get `Product` collection documents with a `price` between // 10_00 and 100_00 (inclusive). Product.where(.price >= 10_00 && .price <= 100_00) .order(.price, .name, .description, .stock) ``` For better performance on large datasets, use an index with values to run [range searches](../../../data-model/indexes/#range-search) on collection documents, Define the index in the collection schema: ```fsl collection Product { ... // Defines the `sortedByPriceLowToHigh()` index. index sortedByPriceLowToHigh { // `values` are document fields for sorting and range searches. values [.price, .name, .description, .stock] } } ``` Call the index in a query: ```fql // Get `Product` collection documents with a `price` between // 10_00 and 100_00 (inclusive). Product.sortedByPriceLowToHigh({ from: 10_00, to: 100_00 }) ``` | See Run a range search | | --- | --- | --- | ### [](#projection)Projection ```fql // Projects the `name`, `description`, and `price` fields. Product.sortedByPriceLowToHigh() { name, description, price } ``` ``` { data: [ { name: "single lime", description: "Conventional, 1 ct", price: 35 }, { name: "cilantro", description: "Organic, 1 bunch", price: 149 }, ... ] } ``` | Reference: Projection and field aliasing | | --- | --- | --- | ### [](#paginate-results)Paginate results Fauna automatically paginates result Sets with 16 or more elements. When a query returns paginated results, Fauna materializes a subset of the Set with an `after` pagination cursor: ```fql // Uses the `Product` collection's `sortedByPriceLowToHigh()` index to // return all `Product` collection documents. 
// The collection contains more than 16 documents. Product.sortedByPriceLowToHigh() ``` ``` { // The result Set contains 16 elements. data: [ { id: "555", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "single lime", description: "Conventional, 1 ct", price: 35, stock: 1000, category: Category("789") }, { id: "888", coll: Product, ts: Time("2099-07-30T15:57:03.730Z"), name: "cilantro", description: "Organic, 1 bunch", price: 149, stock: 100, category: Category("789") } ], // Use the `after` cursor to get the next page of results. after: "hdW..." } ``` To get the next page of results, pass the `after` cursor to [`Set.paginate()`](../../../../reference/fql-api/set/static-paginate/): ```fql Set.paginate("hdW...") ``` The Fauna client drivers also include methods for automatically iterating through pages. See: * [JavaScript driver docs](../../../../build/drivers/js-client/#pagination) * [Python driver docs](../../../../build/drivers/py-client/#pagination) * [Go driver docs](../../../../build/drivers/go-client/#pagination) * [.NET/C# driver docs](../../../../build/drivers/dotnet-client/#pagination) * [JVM driver docs](../../../../build/drivers/jvm-client/#pagination) ## [](#document-relationships)Document relationships ### [](#create-a-document-relationship)Create a document relationship To create a document relationship, include a document reference as a field value: ```fql // References a `Category` collection document. let produce = Category.byName("produce").first() // Creates a `Product` collection document. Product.create({ name: "lemons", description: "Organic, 16 ct", price: 2_49, stock: 200, // Adds the previous `Category` document reference as // a `category` field value. category: produce }) ``` | See Create a document relationship | | --- | --- | --- | ### [](#resolve-document-relationships)Resolve document relationships You can project a field that contains a document to dynamically resolve the document reference: ```fql let produce = Category.byName("produce").first() // Projects the `name`, `description`, `price`, // and `category` fields. Product.byCategory(produce) { name, description, price, category } ``` ``` { data: [ { name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, // Resolves the `Category` collection document in // the `category` field. category: { id: "789", coll: Category, ts: Time("2099-07-29T21:18:48.680Z"), products: "hdW...", name: "produce", description: "Fresh Produce" } }, { name: "single lime", description: "Conventional, 1 ct", price: 35, category: { id: "789", coll: Category, ts: Time("2099-07-29T21:18:48.680Z"), products: "hdW...", name: "produce", description: "Fresh Produce" } }, ... ] } ``` | See Resolve document relationships with projection | | --- | --- | --- | ### [](#delete-document-relationships)Delete document relationships To delete a document relationship, remove the field that contains the document reference. Removing the field does not delete the referenced document. For example: ```fql // Updates a `Product` collection document. Product.byId("111")?.update({ // Removes the `category` field, which contains a // reference to a `Category` collection document. // Removing the `category` field does not delete // the `Category` document. category: null }) ``` ### [](#dangling-references)Dangling references Deleting a document does not remove its inbound [document references](../../../data-model/relationships/). 
Documents may contain [references](../../../data-model/relationships/) to [Nulldocs](../../../../reference/fql/types/#nulldoc) — documents that don’t exist. These are called dangling references. For example: ```fql // Gets a `Product` collection document. // Use projection to return `name`, `description`, and `category` fields. Product.byId("111") { name, description, // The `category` field contains a reference to a `Category` collection document. category } ``` ``` { name: "cups", description: "Translucent 9 Oz, 100 ct", // If the referenced `Category` collection document doesn't exist, // the projection returns a NullDoc. category: Category("123") /* not found */ } ``` ### [](#perform-a-cascading-delete)Perform a cascading delete A cascading delete is an operation where deleting a document in one collection automatically deletes related documents in other collections. Fauna doesn’t provide automatic cascading deletes for user-defined collections. Instead, you can use an index and [`set.forEach()`](../../../../reference/fql-api/set/foreach/) to iterate through a document’s relationships. In the following example, you’ll delete a `Category` collection document and any `Product` documents that reference the category. 1. Define an index as part of a [collection schema](../../../schema/): ```fsl collection Product { ... category: Ref ... // Defines the `byCategory()` index. // Use the index to get `Product` collection // documents by `category` value. In this case, // `category` contains a reference to a `Category` collection document. index byCategory { terms [.category] } } ``` 2. Use the index and [`set.forEach()`](../../../../reference/fql-api/set/foreach/) to delete the category and any related products: ```fql // Gets a `Category` collection document. let category = Category.byId("333") // Gets `Product` collection documents that // contain the `Category` document in the `category` field. let products = Product.byCategory(category) // Deletes the `Category` collection document. category?.delete() // Deletes `Product` collection documents that // contain the `Category` document in the `category` field. products.forEach(.delete()) // Returns `null` ``` ## [](#child-databases)Child databases ### [](#create-a-child-database)Create a child database ```fql // Creates the `childDB` child database. Database.create({ name: "childDB", // Enables typechecking for the database. typechecked: true }) ``` | Reference: Database.create() | | --- | --- | --- | ### [](#get-a-child-database)Get a child database ```fql // Gets the `childDB` child database. Database.byName("childDB") ``` | Reference: Database.byName() | | --- | --- | --- | ### [](#update-a-child-database)Update a child database ```fql // Updates the `childDB` child database's // `typechecked` field. Database.byName("childDB")?.update({typechecked: false}) ``` | Reference: database.update() | | --- | --- | --- | ### [](#delete-a-child-database)Delete a child database ```fql // Deletes the `childDB` child database. Database.byName("childDB")?.delete() ``` | Reference: database.delete() | | --- | --- | --- | # Bulk writes This guide covers common patterns for bulk writes in FQL. You use bulk writes to create, update, or delete multiple collection documents in a single query. 
## [](#create-multiple-documents)Create multiple documents Use [`array.forEach()`](../../../../reference/fql-api/array/foreach/) and [`collection.create()`](../../../../reference/fql-api/collection/instance-create/) to create a collection document for each element of an object Array: ```fql // Create an Array of objects that contain document data. let customers = [ { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } }, { "name": "Scott Chegg", "email": "chegg@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } }, { "name": "Hilary Ouse", "email": "ouse@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } ] // Use `forEach()` to create a `Customer` collection document for each // element of the previous Array. customers.forEach(doc => Customer.create(doc)) // `forEach()` returns `null`. ``` | Reference: array.forEach(), collection.create() | | --- | --- | --- | ## [](#edit-multiple-documents)Edit multiple documents Use [`set.forEach()`](../../../../reference/fql-api/set/foreach/) and [`document.update()`](../../../../reference/fql-api/document/update/) to iteratively update each document in a Set: ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. let customers = Customer.where( .address?.state == "DC" ) // Use `forEach()` to update each document in the previous Set. customers.forEach(doc => doc.update({ address: { street: doc?.address?.street, city: doc?.address?.city, state: "District of Columbia", postalCode: doc?.address?.postalCode, country: doc?.address?.country, } })) // `forEach()` returns `null`. ``` | Reference: set.forEach(), document.update() | | --- | --- | --- | ## [](#upsert-multiple-documents)Upsert multiple documents Use an [`if... else`](../../../../reference/fql/statements/#if) expression to upsert documents. An upsert conditionally: * Creates a document if a document with the specified key doesn’t exist. * Updates an existing document if a document with the key already exists. The following query uses an [`if... else`](../../../../reference/fql/statements/#if) expression with [`array.forEach()`](../../../../reference/fql-api/array/foreach/), [`collection.create()`](../../../../reference/fql-api/collection/instance-create/), and [`document.update()`](../../../../reference/fql-api/document/update/). ```fql // Create an Array of customer data to upsert. let customersToUpsert = [ { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "123 Coding Lane", "city": "Programmington", "state": "CA", "postalCode": "90210", "country": "US" } }, { "name": "Scott Chegg", "email": "chegg@example.com", "address": { "street": "456 Database Drive", "city": "Queryville", "state": "NY", "postalCode": "10001", "country": "US" } }, { "name": "Hilary Ouse", "email": "ouse@example.com", "address": { "street": "789 Algorithm Avenue", "city": "Looptown", "state": "TX", "postalCode": "75001", "country": "US" } } ] // Define a function to upsert each customer. let upsertCustomer = (customerData) => { // Get each existing customer by their email address. // The `Customer` collection contains a unique constraint // that enforces unique `email` field values.
let existingCustomer = Customer.byEmail(customerData.email).first() if (existingCustomer == null) { // Create a new customer if not found. Customer.create(customerData) } else { // If found, update the existing customer. existingCustomer!.update(customerData) } } // Use `forEach()` to update each document in the previous Set. customersToUpsert.forEach(customer => upsertCustomer(customer)) // `forEach()` returns `null`. ``` | Reference: set.forEach(), collection.create(), document.update() | | --- | --- | --- | ## [](#delete-multiple-documents)Delete multiple documents Use [`set.forEach()`](../../../../reference/fql-api/set/foreach/) and [`document.delete()`](../../../../reference/fql-api/document/delete/) to iteratively delete documents in a Set: ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. let customers = Customer.where( .address?.state == "DC" ) // Use `forEach()` to delete each document in the previous Set. customers.forEach(doc => doc.delete()) // `forEach()` returns `null`. ``` | Reference: set.forEach(), document.delete() | | --- | --- | --- | ## [](#paginate-bulk-writes)Paginate bulk writes Queries are subject to [size limits](../../../../reference/requirements-limits/#glimits). If you’re performing bulk writes on a large dataset, you can use [`set.pageSize()`](../../../../reference/fql-api/set/pagesize/) and [`set.paginate()`](../../../../reference/fql-api/set/paginate/) to perform the write over several queries instead of one. ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. Use `pageSize()` // and`paginate()` to paginate results and // limit each page to two documents. let page = Customer.where( .address?.state == "DC" ) .pageSize(2).paginate() // `paginate()` returns an object. The object's `data` property // contains an Array of `Customer` documents. let data = page.data // Use `forEach()` to update each `Customer` document in the // `data` Array. data.forEach(doc => doc.update({ address: { street: doc?.address?.street, city: doc?.address?.city, state: "District of Columbia", postalCode: doc?.address?.postalCode, country: doc?.address?.country, } })) // Project the `after` cursor returned by `paginate()`. // You can use the cursor to iterate through the remaining // pages. page { after } ``` The query returns an `after` cursor: ``` { after: "hdWDxoq..." } ``` Subsequent queries use the cursor and [`Set.paginate()`](../../../../reference/fql-api/set/static-paginate/) to iterate through the remaining pages: ```fql // Uses `Set.paginate()` to iterate through pages. let page = Set.paginate("hdWDxoq...") let data = page.data data.forEach(doc => doc.update({ address: { street: doc?.address?.street, city: doc?.address?.city, state: "District of Columbia", postalCode: doc?.address?.postalCode, country: doc?.address?.country, } })) page { after } ``` | See Pagination | | --- | --- | --- | # Check a secret’s user-defined roles | Learn: Roles | | --- | --- | --- | This guide covers how to check the [user-defined roles](../../../security/roles/) assigned to a Fauna authentication [secret](../../../security/authentication/#secrets) using a user-defined collection and [user-defined functions (UDFs)](../../../schema/user-defined-functions/). 1. Create a [collection schema](../../../../reference/fsl/collection/) for the `RoleCheck` collection: ```fsl // Defines the `RoleCheck` collection. // The collection will contain a document for each // user-defined role. 
collection RoleCheck { // Defines the `byName()` index. // Use the index to get `RoleCheck` collection documents by // their `name`. index byName { terms [.name] } } ``` 2. Create [function schemas](../../../../reference/fsl/function/) for the `currentRoles` and `hasRole` UDFs: ```fsl // Defines the `currentRoles()` UDF. // Returns an Array of user-defined roles. // Ex: ["customer", "manager"] function currentRoles() { RoleCheck.all().map(.name).toArray() } // Defines the `hasRole()` UDF. // Takes a role as an argument. Returns `true` or `false`. function hasRole(role) { RoleCheck.byName(role).first() != null } ``` Commit the schema to Fauna using a [staged schema change](../../../schema/manage-schema/#staged). 3. Run the following FQL query with a secret that uses the built-in `admin` or `server` roles. The query populates the `RoleCheck` collection with a document for each user-defined role. ```fql // Gets all user-defined roles as an Array. let roles = Role.all().toArray() // Creates a `RoleCheck` collection document for // each role in the previous Array. roles.map(role => { RoleCheck.create({ name: role.name }) }) ``` 4. Run the following FQL query to add privileges for the new collection and UDFs to existing user-defined roles. ```fql // Gets all user-defined roles as an Array. let roles = Role.all().toArray() // Adds privileges for the new collection and UDFs to each role // in the previous Array. roles.map(role => { let newPrivileges = role.privileges.concat([ { resource: "RoleCheck", actions: { read: "doc => doc.name == '#{role.name}'" } }, { resource: "hasRole", actions: { call: true } }, { resource: "currentRoles", actions: { call: true } } ]) role.update({ privileges: newPrivileges }) }) ``` 5. Run the `hasRole()` and `currentRoles()` UDFs in FQL queries using secrets assigned to various roles: ```fql // Secret with the built-in `admin` role. currentRoles() // ["customer", "manager"] hasRole("customer") // true hasRole("manager") // true // Secret with the user-defined `customer` role. currentRoles() // ["customer"] hasRole("customer") // true hasRole("manager") // false // Secret with the user-defined `manager` role. currentRoles() // ["manager"] hasRole("customer") // false hasRole("manager") // true ``` # Conditional operations FQL doesn’t have a ternary (conditional) operator. You can get the same result using an [`if … else`](../../../../reference/fql/statements/#if) statement. For example, to perform an upsert: ```fql // Customer email to look up let email = "alice.appleseed@example.com" // Customer data to upsert let data = { name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } // Try to find the existing customer by email. // If the customer doesn't exist, returns `null`. let customer = Customer.byEmail(email).first() // Create or update the customer based on existence. // If customer is null, create a new customer. // Otherwise, update the existing one. if (customer == null) { Customer.create(data) } else { customer!.update(data) } ``` ## [](#null-checking)Null checking If you’re checking for a null value, you can use the [null coalescing (`??`)](../../../../reference/fql/operators/#null-coalescing) operator to return a default value when an expression is `null`. For example: ```fql // Try to find the existing customer by email. // If the customer doesn't exist, return `null`.
let customer = Customer.byEmail("carol.clark@example.com")?.first() // Use the null coalescing (??) operator to return the customer's // cart. If the customer or their cart is `null`, return `"Not found"`. // In this case, the customer's cart is `null`. customer?.cart ?? "Not found" // returns "Not found" ``` # Get unique field values This guide covers how to best get unique values for a field from a Set of documents. As an example, the guide shows you how to get unique `name` field values from `Product` collection documents. An example of a `Product` document: ``` { id: "111", coll: Product, ts: Time("2099-07-30T16:03:51.840Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") } ``` ## [](#define-an-index-that-covers-the-field)Define an index that covers the field Add the field as an index value in an [index definition](../../../data-model/indexes/): ```fsl collection Product { ... // Defines the `sortedByPriceLowToHigh()` index. // The index includes the `name` field as an index value. // The index can include `terms` and other `values`. index sortedByPriceLowToHigh { values [.price, .name, .description, .stock] } } ``` Using an index improves performance and reduces costs by avoiding the need to read each document individually. ## [](#extract-unique-values-in-a-small-set)Extract unique values in a small Set If your Set contains fewer than 16,000 documents, you can use an FQL query to get unique values for the field. In the query: * Use [`set.map()`](../../../../reference/fql-api/set/map/) to iterate through each `Product` collection document. * Use [`set.toArray()`](../../../../reference/fql-api/set/toarray/) to convert field values to an Array. * Call [`array.distinct()`](../../../../reference/fql-api/array/distinct/) to return a deduplicated Array containing only unique elements. * Optionally, call [`array.order()`](../../../../reference/fql-api/array/order/) to sort the deduplicated Array. ```fql // Uses `map()` to return a Set of `name` field values // from `Product` collection documents. Product.sortedByPriceLowToHigh().map(doc => doc.name) // `toArray()` converts the Set to an Array. .toArray() // `distinct()` returns a deduplicated Array. .distinct() // (Optional) `order()` sorts the deduplicated Array. .order() ``` ``` [ "avocados", "cilantro", "cups", "donkey pinata", "limes", "organic limes", "pizza", "single lime", "taco pinata" ] ``` ## [](#extract-unique-values-in-a-large-set)Extract unique values in a large Set If your Set contains 16,000 or more documents, the previous query would require pagination. [`array.distinct()`](../../../../reference/fql-api/array/distinct/) would only be able to extract unique elements from each _page_ of results. Instead, it’s more efficient to retrieve all field values and process them on the client side. For example, using Fauna’s JavaScript client: ```javascript // Uses `map()` and `pageSize() to get the `name` field values // of all `Product` collection documents in batches of 500. const query = fql`Product.sortedByPriceLowToHigh() .map(doc => doc.name) .pageSize(500)` const iter = client.paginate(query) // In JavaScript, a `Set` object only stores unique values. const names = new Set() for await (const name of iter.flatten()) { names.add(name) } ``` # Geospatial search In this guide, you’ll use Fauna to run geospatial searches in a given area using a bounding box search pattern. The method uses latitude and longitude coordinates to find places in a radius of a provided location. 
The guide uses example locations from Toronto, Canada. ## [](#create-the-calculateboundingbox-function)Create the `calculateBoundingBox()` function To start, create a user-defined function (UDF) to calculate the minimum and maximum coordinates for the bounding box: ```fsl function calculateBoundingBox( lat: Number, lon: Number, radius: Number):{ minLat: Number, maxLat: Number, minLon: Number, maxLon: Number } { let R = 6371 let radLat = lat * Math.PI / 180 let deltaLat = radius / R let deltaLon = Math.asin(Math.sin(deltaLat) / Math.cos(radLat)) let minLat = lat - deltaLat * 180 / Math.PI let maxLat = lat + deltaLat * 180 / Math.PI let minLon = lon - deltaLon * 180 / Math.PI let maxLon = lon + deltaLon * 180 / Math.PI let cord = { minLat: minLat, maxLat: maxLat, minLon: minLon, maxLon: maxLon } cord } ``` ## [](#create-the-haversine-function)Create the `haversine()` function Next, create a `haversine()` function to calculate the distance, in kilometers, between two geographic points using their coordinates. The function uses the Haversine formula to account for the curvature of the earth. ```fsl function haversine( coords1: { latitude: Number, longitude: Number }, coords2: { latitude: Number, longitude: Number }): Number { let lat1 = coords1.latitude let lon1 = coords1.longitude let lat2 = coords2.latitude let lon2 = coords2.longitude let R = 6371 let x1 = lat2 - lat1 let dLat = x1 * Math.PI / 180 let x2 = lon2 - lon1 let dLon = x2 * Math.PI / 180 let a = Math.sin(dLat / 2) * Math.sin(dLat / 2) + Math.cos(lat1 * Math.PI / 180) * Math.cos(lat2 * Math.PI / 180) * Math.sin(dLon / 2) * Math.sin(dLon / 2) let c = 2 * Math.asin(Math.sqrt(a)); let distance = R * c distance } ``` ## [](#add-sample-data)Add sample data Create a `Location` collection: ```fsl collection Location { history_days 0 index orderByLatLong { values [.latitude, .longitude] } } ``` Add sample data for a few locations to the collection: ```fql Location.create({ name: "St. Lawrence Market", latitude: 43.6487, longitude: -79.3716 }) Location.create({ name: "Royal Ontario Museum", latitude: 43.6677, longitude: -79.3948 }) Location.create({ name: "Toronto Music Garden", latitude: 43.6366, longitude: -79.3951 }) Location.create({ name: "High Park", latitude: 43.6465, longitude: -79.4637 }) Location.create({ name: "Ontario Science Centre", latitude: 43.7163, longitude: -79.3390 }) Location.create({ name: "Casa Loma", latitude: 43.6780, longitude: -79.4094 }) Location.create({ name: "Scarborough Bluffs", latitude: 43.7064, longitude: -79.2328 }) Location.create({ name: "Black Creek Pioneer Village", latitude: 43.7735, longitude: -79.5164 }) ``` ## [](#query-inside-a-radius)Query inside a radius Run the following query to search for places in a 5 km radius of a defined location. The results only include places in the radius. Results are ordered by distance from the location. 
```fql let distanceKm = 5.0 // find all points within 5 km of the current location // Define the current location let latitude = 43.6532 let longitude = -79.3832 let boundingbox = calculateBoundingBox(latitude, longitude, distanceKm) Location.orderByLatLong({ from: [boundingbox.minLat, boundingbox.minLon], to: [boundingbox.maxLat, boundingbox.maxLon] }) .where( .longitude >= boundingbox.minLon && .longitude <= boundingbox.maxLon ) .map(l => { location: l, distance: haversine( { latitude: l.latitude, longitude: l.longitude }, { latitude: latitude, longitude: longitude } ) }) .order(asc(.distance)) ``` Results: ``` { data: [ { location: { id: "396567265944797257", coll: Location, ts: Time("2099-04-30T05:27:46.260Z"), name: "St. Lawrence Market", latitude: 43.6487, longitude: -79.3716 }, distance: 1.0589651226108578 }, { location: { id: "396631857567891533", coll: Location, ts: Time("2099-04-30T22:34:25.630Z"), name: "Royal Ontario Museum", latitude: 43.6677, longitude: -79.3948 }, distance: 1.8628877543323947 }, { location: { id: "396567265945845833", coll: Location, ts: Time("2099-04-30T05:27:46.260Z"), name: "Toronto Music Garden", latitude: 43.6366, longitude: -79.3951 }, distance: 2.0794133933697307 }, { location: { id: "396567265946894409", coll: Location, ts: Time("2099-04-30T05:27:46.260Z"), name: "Casa Loma", latitude: 43.678, longitude: -79.4094 }, distance: 3.470709059384686 } ] } ``` ## [](#optimizing-the-query)Optimizing the query You can optimize the query in previous example. You can make a cheap index by chunking latitude and longitude together. Update your `Location` collection to include the following index: ```fsl collection Location { history_days 0 compute lat_floor = doc => Math.floor(doc.latitude) index by_latitude__logitude_asc { terms [.lat_floor] values [.longitude] } } ``` Update the query to use the new index: ```fql // find all points within 5 km of the current location let distanceKm = 5.0 // Define the current location let latitude = 43.6532 let longitude = -79.3832 let boundingbox = calculateBoundingBox(latitude, longitude, distanceKm) let lat_range = Set.sequence(Math.floor(boundingbox.minLat), Math.ceil(boundingbox.maxLat)) lat_range .flatMap(lat => { Location.by_latitude__logitude_asc( lat * 1.0, { from: boundingbox.minLon, to: boundingbox.maxLon } ) }) .map(l => { location: l, distance: haversine( { latitude: l.latitude, longitude: l.longitude }, { latitude: latitude, longitude: longitude } ) }) .where(.distance < distanceKm) .order(asc(.distance)) ``` Results: ``` { data: [ { location: { id: "396567265944797257", coll: Location, ts: Time("2099-04-30T05:27:46.260Z"), lat_floor: 43.0, name: "St. Lawrence Market", latitude: 43.6487, longitude: -79.3716 }, distance: 1.0589651226108578 }, { location: { id: "396631857567891533", coll: Location, ts: Time("2099-04-30T22:34:25.630Z"), lat_floor: 43.0, name: "Royal Ontario Museum", latitude: 43.6677, longitude: -79.3948 }, distance: 1.8628877543323947 }, { location: { id: "396567265945845833", coll: Location, ts: Time("2099-04-30T05:27:46.260Z"), lat_floor: 43.0, name: "Toronto Music Garden", latitude: 43.6366, longitude: -79.3951 }, distance: 2.0794133933697307 }, { location: { id: "396567265946894409", coll: Location, ts: Time("2099-04-30T05:27:46.260Z"), lat_floor: 43.0, name: "Casa Loma", latitude: 43.678, longitude: -79.4094 }, distance: 3.470709059384686 } ] } ``` # Group By: Aggregate data in Fauna SQL’s `GROUP BY` operation lets you organize rows into groups based on the value of specific columns. 
`GROUP BY` is commonly used to aggregate data. For example, you can use `GROUP BY` to: * Calculate total sales by category * Count customers by region * Find average prices by product category. ## [](#create-a-groupby-function)Create a `groupBy()` function [FQL](../../) doesn’t provide a built-in `GROUP BY` operation. However, you use [`array.fold()`](../../../../reference/fql-api/array/fold/) and [`set.fold()`](../../../../reference/fql-api/set/fold/) in an [FQL function](../../../../reference/fql/functions/) or a [user-defined function (UDF)](../../../schema/user-defined-functions/) to achieve the same result. As an FQL function: ```fql // Defines an anonymous `groupBy()` function. // `groupBy()` two arguments: // * `set`: Set or Array containing data to group // * `key_fn`: Grouping key for each element let groupBy = (set, key_fn) => { // Calls the `fold()` function on the `set` // Set or Array. set.fold( // Start with an empty object. {}, (acc, val) => { // For each value, get a key using the `key_fn` arg. let key = key_fn(val) let existing_group = acc[key] ?? [] // Append the current value to the Set or // Array for that key. let new_group = existing_group.append(val) let new_entry = Object.fromEntries([ [key, new_group] ]) // Return an object with grouped results. Object.assign(acc, new_entry) } ) } // Call the `groupBy()` function. // Groups `Product` documents by category name. groupBy(Product.all(), .category!.name) ``` You can also define a `groupBy()` UDF. This lets you reuse the function across multiple queries. You create and manage a UDF as an FSL [function schema](../../../../reference/fsl/function/): ```fsl // Defines the `groupBy()` UDF. // `groupBy()` two arguments: // * `set`: Set or Array containing data to group // * `key_fn`: Grouping key for each element function groupBy (set, key_fn) { // Calls the `fold()` function on the `set` // Set or Array. set.fold( // Start with an empty object. {}, (acc, val) => { // For each value, get a key using the `key_fn` arg. let key: String = key_fn(val) let existing_group = acc[key] ?? [] // Append the current value to the Set or // Array for that key. let new_group = existing_group.append(val) let new_entry = Object.fromEntries([ [key, new_group] ]) // Return an object with grouped results. Object.assign(acc, new_entry) } ) } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../schema/manage-schema/#fql) ## [](#usage-examples)Usage examples This section contains examples using the previously defined `groupBy()` UDF. ### [](#group-numbers-by-range)Group numbers by range ```fql let numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9] groupBy(numbers, i => if(i < 5) { "low" } else { "high" }) ``` ``` { low: [ 1, 2, 3, 4 ], high: [ 5, 6, 7, 8, 9 ] } ``` ### [](#group-objects-by-property)Group objects by property ```fql let items = [ { val: 1 }, { val: 2 }, { val: 3 } ] groupBy(items, .val.toString()) ``` ``` { "1": [ { val: 1 } ], "2": [ { val: 2 } ], "3": [ { val: 3 } ] } ``` ### [](#group-query-results)Group query results ```fql // Get the `frozen` and `produce` categories. let frozen = Category.byName("frozen").first() let produce = Category.byName("produce").first() // Get products for the `frozen` and `produce` // categories. 
let items = ( Product.byCategory(frozen) .concat(Product.byCategory(produce)) { id, category: .category!.name }) .take(5) .toArray() // Group products by category name. groupBy(items, .category) ``` ``` { frozen: [ { id: "333", category: "frozen" } ], produce: [ { id: "444", category: "produce" }, { id: "555", category: "produce" }, { id: "666", category: "produce" }, { id: "777", category: "produce" } ] } ``` # String search You can use string searches to fetch documents based on a matching string or substring. This guide covers common patterns for string searches in FQL queries. ## [](#exact-match-search)Exact match search You can use [`collection.where()`](../../../../reference/fql-api/collection/instance-where/) with an equality comparison to run exact match searches on [String](../../../../reference/fql/types/#string) fields: ```fql Product.where(.name == "cups") ``` Results: ``` { data: [ { id: "111", coll: Product, ts: Time("2099-07-30T16:03:51.840Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") } ] } ``` Equality comparisons are case-sensitive. Use [`string.toLowerCase()`](../../../../reference/fql-api/string/tolowercase/) to make the search case-insensitive: ```fql Product.where(.name.toLowerCase() == "CuPs".toLowerCase()) ``` ### [](#use-an-index-for-exact-match-search)Use an index for exact match search Calling [`collection.where()`](../../../../reference/fql-api/collection/instance-where/) directly on a collection requires a scan of the entire collection. It isn’t performant for large collections. Instead, use an index with a term to run an exact match search: ```fql Product.byName("cups") ``` `name` is the only term for the `byName()` index. The search is case-sensitive. ## [](#prefix-search)Prefix search Use [`string.startsWith()`](../../../../reference/fql-api/string/startswith/) to match strings that begin with a specific substring. [`string.startsWith()`](../../../../reference/fql-api/string/startswith/) is case-sensitive. ```fql Customer.where(.name.startsWith("Al")) ``` ``` { data: [ { id: "111", coll: Customer, ts: Time("2099-07-30T16:03:51.840Z"), cart: Order("412483941752112205"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ] } ``` ## [](#suffix-search)Suffix search Use [`string.endsWith()`](../../../../reference/fql-api/string/endswith/) to match strings that end with a specific substring. [`string.endsWith()`](../../../../reference/fql-api/string/endswith/) is case-sensitive. ```fql Customer.where(.name.endsWith("Appleseed")) ``` Results: ``` { data: [ { id: "111", coll: Customer, ts: Time("2099-07-30T16:03:51.840Z"), cart: Order("412483941752112205"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ] } ``` ## [](#substring-search)Substring search Use [`string.includes()`](../../../../reference/fql-api/string/includes/) to match strings that contain a specific substring. [`string.includes()`](../../../../reference/fql-api/string/includes/) is case-sensitive. 
```fql Product.where(.description.includes("16 oz bag")) ``` Results: ``` { data: [ { id: "666", coll: Product, ts: Time("2099-07-30T16:03:51.840Z"), name: "organic limes", description: "Organic, 16 oz bag", price: 349, stock: 50, category: Category("789") }, { id: "777", coll: Product, ts: Time("2099-07-30T16:03:51.840Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("789") } ] } ``` ## [](#regex-search)Regex search Use [`string.includesRegex()`](../../../../reference/fql-api/string/includesregex/) to check for substrings matching a regular expression: ```fql // Regex for 5-digit postal codes that begin with `2`. let regex = "(2[0-9]{4})" Customer.where(.address.postalCode.includesRegex(regex)) ``` Results: ``` { data: [ { id: "111", coll: Customer, ts: Time("2099-07-30T16:03:51.840Z"), cart: Order("412483941752112205"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, ... ] } ``` ## [](#return-matching-substrings)Return matching substrings [`string.matches()`](../../../../reference/fql-api/string/matches/) is similar to [`string.includes()`](../../../../reference/fql-api/string/includes/) and [`string.includesRegex()`](../../../../reference/fql-api/string/includesregex/) except it returns an Array of matching substrings. You can pass [`string.matches()`](../../../../reference/fql-api/string/matches/) a substring: ```fql let name = "Denny Johnson" name.matches('Johnson') ``` ``` [ "Johnson" ] ``` [`string.matches()`](../../../../reference/fql-api/string/matches/) also accepts a regular expression: ```fql let phone = "416-695-4364" let regex = "(?:(1)?)?[-.●]?([0-9]{3})[-.●]?([0-9]{3})[-.●]?([0-9]{4})" phone.matches(regex) ``` ``` [ "416-695-4364" ] ``` # Work with dates and times | Reference: FQL Date docs, FQL Time docs | | --- | --- | --- | [FQL](../../) includes types for [date](../../../../reference/fql/types/#date) and [time](../../../../reference/fql/types/#time) values. This guide covers how to use FQL’s built-in methods for date and time values to: * Convert strings to dates and times (and the reverse) * Convert Unix timestamps to times (and the reverse) * Perform date-time math while automatically converting time units, accounting for variable month lengths, leap days, and more. * Sort document Sets on date and time values. * Run range searches on dates and times in document Sets. ## [](#dates)Dates Use FQL’s [Date](../../../../reference/fql-api/date/) methods to create and manipulate [date](../../../../reference/fql/types/#date) values. Date values are in UTC and don’t include a time component. ### [](#get-the-current-date)Get the current date Use [`Date.today()`](../../../../reference/fql-api/date/today/) to get the current date as a date value: ```fql Date.today() ``` ### [](#convert-a-string-to-a-date)Convert a string to a date Use [`Date()`](../../../../reference/fql-api/date/date/) to convert a `YYYY-MM-DD` [string](../../../../reference/fql/types/#string) to a date: ```fql Date("2099-10-20") ``` [`Date.fromString()`](../../../../reference/fql-api/date/fromstring/) is equivalent to [`Date()`](../../../../reference/fql-api/date/date/): ```fql // Equivalent to Date("2099-10-20") Date.fromString("2099-10-20") ``` ### [](#convert-a-time-to-a-date)Convert a time to a date FQL doesn’t include a built-in method for converting a [time](../../../../reference/fql/types/#time) to date. 
However, you can use an [FQL function](../../../../reference/fql/functions/) or [user-defined functions (UDFs)](../../../schema/user-defined-functions/) to convert a time to a date. As an FQL function: ```fql // Accepts a time value. let timeToDate: (Time) => Date = (time) => { // Convert the time to an ISO 8601 string. let timeStr = time.toString() // Remove the time component from the string. let dateStr = timeStr.split("T")[0] // Convert the string to a date value. Date(dateStr) } // Convert the current date to a time. timeToDate(Time.now()) ``` As a UDF: ```fsl function timeToDate(time: Time): Date { // Convert the time to an ISO 8601 string. let timeStr = time.toString() // Remove the time component from the string. let dateStr = timeStr.split("T")[0] // Convert the string to a date value. Date(dateStr) } ``` ### [](#convert-a-date-to-a-string)Convert a date to a string Use [`date.toString()`](../../../../reference/fql-api/date/tostring/) to convert a date value to a `YYYY-MM-DD` string: ```fql // Get the current date as an IS0 8601 string. Date.today().toString() ``` ### [](#add-to-a-date)Add to a date Use [`date.add()`](../../../../reference/fql-api/date/add/) to add days to a date: ```fql // Adds 5 days to the current date. Date.today().add(5, 'days') ``` ### [](#subtract-from-a-date)Subtract from a date Use [`date.subtract()`](../../../../reference/fql-api/date/subtract/) to subtract days from a date: ```fql // Subtract 2 days from the current date. Date.today().subtract(2, 'days') ``` ### [](#get-the-difference-between-two-dates)Get the difference between two dates Use [`date.difference()`](../../../../reference/fql-api/date/difference/) to get the difference between two dates in days: ```fql Date('2099-02-10').difference(Date('2099-01-01')) ``` ### [](#compare-two-dates)Compare two dates You can use [comparison operators](../../../../reference/fql/operators/#comparison) to compare date values: ```fql Date('2099-02-10') > Date('2099-02-09') ``` ### [](#sort-a-set-on-a-date-value)Sort a Set on a date value You can use [`set.order()`](../../../../reference/fql-api/set/order/) to sort a Set by a date value: ```fql // Sorts `Order` documents by descending // `createdAtDate` date value. Order.all().order(desc(.createdAtDate)) ``` For better performance with large Sets, [define an index](../../../data-model/indexes/#define-an-index) that includes the date field as an index value: ```fsl collection Order { ... // Defines the `sortedByCreatedAtDateHighToLow()` index. index sortedByCreatedAtDateHighToLow { // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // descending `createdAtDate` field value. values [desc(.createdAtDate)] } ... } ``` Then call the index to return the Set sorted by the value: ```fql // Sorts `Order` documents by descending // `createdAtDate` date value. Order.sortedByCreatedAtDateHighToLow() ``` ### [](#run-a-ranged-search-on-date-values)Run a ranged search on date values You can use [`set.where()`](../../../../reference/fql-api/set/where/) to filter a Set on a range of date values: ```fql // Gets `Order` documents with a `createdAtDate` date: // - Less than or equal to the current date. // - Greater than or equal to the current date minus 5 days. 
Order.all() .where(.createdAtDate <= Date.today()) .where(.createdAtDate >= Date.today().subtract(5, "days")) ``` For better performance with large Sets, [define an index](../../../data-model/indexes/#define-an-index) that includes the date field as an index value: ```fsl collection Order { createdAtDate: Date ... // Defines the `sortedByCreatedAtDateHighToLow()` index. index sortedByCreatedAtDateHighToLow { // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // descending `createdAtDate` field value. values [desc(.createdAtDate)] } ... } ``` Then use the index to [run a range search](../../../data-model/indexes/#range-search) on the index value: ```fql // Gets `Order` documents with a `createdAtDate` date: // - Less than or equal to the current date. // - Greater than or equal to the current date minus 5 days. Order.sortedByCreatedAtDateHighToLow({ from: Date.today(), to: Date.today().subtract(5, "days") }) ``` ## [](#times)Times Use FQL’s [Time](../../../../reference/fql-api/time/) methods to create and manipulate [time](../../../../reference/fql/types/#time) values. All time values are in UTC. ### [](#get-the-current-time)Get the current time Use [`Time.now()`](../../../../reference/fql-api/time/now/) to get the current time as a time value: ```fql Time.now() ``` ### [](#convert-an-iso-8601-string-to-a-time)Convert an ISO 8601 string to a time Use [`Time()`](../../../../reference/fql-api/time/time/) to convert an ISO 8601 [string](../../../../reference/fql/types/#string) to a time: ```fql Time("2099-10-20T21:15:09.890729Z") ``` [`Time.fromString()`](../../../../reference/fql-api/time/fromstring/) is equivalent to [`Time()`](../../../../reference/fql-api/time/time/): ```fql // Equivalent to Time("2099-10-20T21:15:09.890729Z") Time.fromString("2099-10-20T21:15:09.890729Z") ``` ### [](#convert-a-unix-timestamp-to-a-time)Convert a Unix timestamp to a time Use [`Time.epoch()`](../../../../reference/fql-api/time/epoch/) to convert a Unix timestamp to a time. [`Time.epoch()`](../../../../reference/fql-api/time/epoch/) requires the time increment used to offset the Unix epoch: ```fql Time.epoch(1676030400, 'seconds') ``` ### [](#convert-a-date-to-a-time)Convert a date to a time FQL doesn’t include a built-in method for converting [dates](../../../../reference/fql/types/#date) to times. However, you can use an [FQL function](../../../../reference/fql/functions/) or [user-defined functions (UDFs)](../../../schema/user-defined-functions/) to convert dates to times. As a function: ```fql // Accepts a date value. let dateToTime: (Date) => Time = (date) => { // Convert the date to a string. let dateStr = date.toString() // Append a time component to the IS0 8601 string. // In this example, use midnight. let timeStr = dateStr + "T00:00:00Z" // Convert the ISO 8601 string to a time value. Time(timeStr) } // Convert the current date to a time. dateToTime(Date.today()) ``` As a UDF: ```fsl function dateToTime(date: Date): Time { // Convert the date to a string. let dateStr = date.toString() // Append a time component to the string. // In this example, use midnight. let timeStr = dateStr + "T00:00:00Z" // Convert the ISO 8601 string to a time value. Time(timeStr) } ``` ### [](#convert-a-time-to-an-iso-8601-string)Convert a time to an ISO 8601 string Use [`time.toString()`](../../../../reference/fql-api/time/tostring/) to convert a time to an ISO 8601 string: ```fql // Get the current time as an IS0 8601 string. 
Time.now().toString() ``` ### [](#convert-a-time-to-a-unix-epoch)Convert a time to a Unix epoch Use [`time.toMicros()`](../../../../reference/fql-api/time/tomicros/), [`time.toMillis()`](../../../../reference/fql-api/time/tomillis/), and [`time.toSeconds()`](../../../../reference/fql-api/time/toseconds/) to convert a time to a Unix timestamp, offset using the respective time unit: ```fql // Get the current time as a Unix timestamp // in seconds since the Unix epoch. Time.now().toSeconds() ``` ### [](#add-to-a-time)Add to a time Use [`time.add()`](../../../../reference/fql-api/time/add/) to add time increments to a time: ```fql // Adds 20 minutes to the current time. Time.now().add(20, 'minutes') ``` ### [](#subtract-from-a-time)Subtract from a time Use [`time.subtract()`](../../../../reference/fql-api/time/subtract/) to subtract time increments from a time: ```fql // Subtract 5 seconds from the current time. Time.now().subtract(5, 'seconds') ``` ### [](#get-the-difference-between-two-times)Get the difference between two times Use [`time.difference()`](../../../../reference/fql-api/time/difference/) to get the difference between two times in a time unit you choose. ```fql Time('2099-02-10T12:10:00.000Z') .difference(Time('2099-02-10T12:00:00.000Z'), 'minutes') ``` ### [](#compare-two-times)Compare two times You can use [comparison operators](../../../../reference/fql/operators/#comparison) to compare time values: ```fql Time('2099-02-10T12:10:00.000Z') > Time('2099-02-10T12:00:00.000Z') ``` ### [](#sort-a-set-on-a-time-value)Sort a Set on a time value You can use [`set.order()`](../../../../reference/fql-api/set/order/) to sort a Set by a time value: ```fql // Sorts `Order` documents by descending // `createdAt` time value. Order.all().order(desc(.createdAt)) ``` For better performance with large Sets, [define an index](../../../data-model/indexes/#define-an-index) that includes the time field as an index value: ```fsl collection Order { ... // Defines the `sortedByCreatedAtHighToLow()` index. index sortedByCreatedAtHighToLow { // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // descending `createdAt` field value. values [desc(.createdAt)] } ... } ``` Then call the index to return the Set sorted by the value: ```fql // Sorts `Order` documents by descending // `createdAt` time value. Order.sortedByCreatedAtHighToLow() ``` ### [](#run-a-range-search-on-time-values)Run a range search on time values You can use [`set.where()`](../../../../reference/fql-api/set/where/) to filter a Set by a range of time values: ```fql // Gets `Order` documents with a `createdAt` time: // - Less than or equal to the current time. // - Greater than or equal to the current time minus 5 days. Order.all() .where(.createdAt <= Time.now()) .where(.createdAt >= Time.now().subtract(5, "days")) ``` For better performance with large Sets, [define an index](../../../data-model/indexes/#define-an-index) that includes the time field as an index value: ```fsl collection Order { createdAt: Time ... // Defines the `sortedByCreatedAtHighToLow()` index. index sortedByCreatedAtHighToLow { // `values` are document fields for sorting and range searches. // In this example, you sort or filter index results by their // descending `createdAt` field value. values [desc(.createdAt)] } ... 
} ``` Then use the index to [run a range search](../../../data-model/indexes/#range-search) on the index value: ```fql // Gets `Order` documents with a `createdAt` time: // - Less than or equal to the current time. // - Greater than or equal to the current time minus 5 days. Order.sortedByCreatedAtHighToLow({ from: Time.now(), to: Time.now().subtract(5, "days") }) ``` ### [](#get-a-documents-timestamp)Get a document’s timestamp All documents include a [`ts` (timestamp) metadata field](../../../data-model/documents/#meta) that records the document’s last write. Fauna stores the field as a read-only [Time](../../../../reference/fql/types/#time) value. You can read the `ts` field like any other document field: ```fql // Projects a `Product` document's `ts` field. Product.byName("limes").first() { ts } ``` # Work with multiple Sets This guide covers common patterns for performing SQL-like [set operations](https://en.wikipedia.org/wiki/Set_operations_\(SQL\)), such as unions, joins, and intersections, in FQL. You can use these operations to combine and filter Sets of collection documents. ## [](#unions)Unions Use [`set.concat()`](../../../../reference/fql-api/set/concat/) to combine two Sets into a single Set. This is similar to a `UNION` clause in SQL. ```fql // Setup: Get frozen and produce `Category` documents. let frozenCat = Category.byName("frozen").first() let produceCat = Category.byName("produce").first() // Get Sets of related `Product` documents for each category. let frozenDocs = Product.byCategory(frozenCat) let produceDocs = Product.byCategory(produceCat) // Use `concat()` to combine the previous // `Product` document Sets. Project each // product's name and category. frozenDocs.concat(produceDocs) { name: .name, category: .category!.name } ``` ``` // The combined Set contains all `Product` documents // in the frozen and produce categories. { data: [ { name: "pizza", category: "frozen" }, ... { name: "cilantro", category: "produce" } ] } ``` ### [](#chain-set-concat)Chain `set.concat()` You can chain [`set.concat()`](../../../../reference/fql-api/set/concat/) to combine three or more Sets. The following example expands on the previous one: ```fql // Setup: Get frozen, produce, and party `Category` documents. let frozenCat = Category.byName("frozen").first() let produceCat = Category.byName("produce").first() let partyCat = Category.byName("party").first() // Get Sets of related `Product` documents for each category. let frozenDocs = Product.byCategory(frozenCat) let produceDocs = Product.byCategory(produceCat) let partyDocs = Product.byCategory(partyCat) // Chain `concat()` expressions to combine the previous // `Product` document Sets. Project each // product's name and category. frozenDocs .concat(produceDocs) .concat(partyDocs) { name: .name, category: .category!.name } ``` ``` // The combined Set contains all `Product` documents // in the frozen, produce, and party categories. { data: [ { name: "pizza", category: "frozen" }, { name: "avocados", category: "produce" }, ... { name: "taco pinata", category: "party" } ] } ``` ### [](#use-set-flatmap)Use `set.flatMap()` When combining several Sets, you can improve readability by using [`set.flatMap()`](../../../../reference/fql-api/set/flatmap/) to iterate through the Sets: ```fql // Start with an Array of category names. ["frozen", "produce", "party"] // Convert the Array to a Set. `flatMap()` operates on Sets. .toSet() // Use `flatMap()` to process each category name // and combine the resulting Sets. `flatMap()` will: // 1. 
Apply the given function to each element of the Set. // 2. Collect the results in a Set, // with a nested child Set for each element. // 3. Flatten the resulting Set by one level, // resulting in a single, flat Set. .flatMap(name => { // Get the `Category` document for each category name. let category = Category.byName(name).first() // Get Sets of related `Product` documents for each category. // Each Set is included in the flattened results, creating // a single Set of products across all listed categories. // Project each product's name and category. Product.byCategory(category) }) { name: .name, category: .category!.name } ``` ``` // The combined Set contains all `Product` documents // in the listed categories. { data: [ { name: "pizza", category: "frozen" }, { name: "avocados", category: "produce" }, ... { name: "taco pinata", category: "party" } ] } ``` ## [](#joins)Joins In Fauna, you can use [document references](../../../data-model/relationships/) to model relationships between documents in different collections. You can use [projection](../../../../reference/fql/projection/) to resolve the references. This is similar to performing a `JOIN` in SQL. ```fql // Get a Set of `Order` documents with a `status` of `cart`. // In production, use an index instead of `where()` for // better performance. let orders = Order.where(.status == "cart") // Projects fields in the `Order` documents. orders { // Resolves the `Customer` document reference in // the `customer` field. customer { id, name, email } } ``` ``` { data: [ { customer: { id: "111", name: "Alice Appleseed", email: "alice.appleseed@example.com" } }, { customer: { id: "222", name: "Bob Brown", email: "bob.brown@example.com" } } ] } ``` ### [](#use-set-map-for-transformations)Use `set.map()` for transformations You can also use [`set.map()`](../../../../reference/fql-api/set/map/) to extract and transform field values from referenced documents. ```fql // Get a Set of `Order` documents with a `status` of `cart`. // In production, use an index instead of `where()` for // better performance. let orders: Any = Order.where(.status == "cart") // Uses `map()` to iterate through each document in the Set // and extract the `customer` field from each `Order` document. // Then project fields from the `customer` field's referenced // `Customer` document. orders.map(order => order.customer) { id, name, email } ``` ``` { data: [ { id: "111", name: "Alice Appleseed", email: "alice.appleseed@example.com" }, { id: "222", name: "Bob Brown", email: "bob.brown@example.com" } ] } ``` ### [](#use-set-flatmap-for-to-many-relationships)Use `set.flatMap()` for \*-to-many relationships For one-to-many or many-to-many relationships, you can use [`set.flatMap()`](../../../../reference/fql-api/set/flatmap/) to extract field values from referenced documents and flatten the resulting Set by one level: ```fql // Get a Set of all `Customer` documents. Customer.all() // Use `flatMap()` to iterate through each document in the Set // and extract fields from: // - The `Order` document referenced in the `order` field. // - The `Customer` document reference nested in the `order` field. // Then flatten the resulting Set. 
.flatMap(customer => { let orders = customer.orders orders { id, total, status, customer { id, name } } }) ``` ``` { data: [ { id: "408922916844994633", total: 5392, status: "cart", customer: { id: "111", name: "Alice Appleseed" } }, { id: "408922916872257609", total: 1880, status: "cart", customer: { id: "222", name: "Bob Brown" } } ] } ``` ## [](#intersections)Intersections An intersection returns elements that exist in multiple Sets, similar to an `INTERSECT` clause in SQL. ### [](#where)Filter using `where()` To perform intersections in FQL, start with the most selective expression and filter the resulting Sets using [`set.where()`](../../../../reference/fql-api/set/where/). For the best performance, use a [covered index call](../../../data-model/indexes/#covered-queries) as the first expression and only filter on covered values. ```fql // Get the produce category. let produce = Category.byName("produce").first() // Use the most selective index, `byName()`, first. // Then filter with covered values. // In this example, `category`, `price`, and `name` // should be covered by the `byName()` index. Product.byName("limes") .where(.category == produce) .where(.price < 500) { name, category: .category!.name, price } ``` Avoid filtering using `includes()` In most cases, you should avoid using [`set.includes()`](../../../../reference/fql-api/set/includes/) to intersect results, including results from [covered index calls](../../../data-model/indexes/#covered-queries). [`set.includes()`](../../../../reference/fql-api/set/includes/) is a linear operation. The compute costs consumed by repeatedly iterating through results will typically exceed the read costs of directly reading documents. For example, the following query is inefficient and will likely incur high compute costs: ```fql // Each variable is a covered index call let limes = Product.byName("limes") let produce = Product.byCategory(Category.byName("produce").first()!) let under5 = Product.sortedByPriceLowToHigh({ to: 500 }) // Uses `includes()` to intersect the results from // covered index calls limes.where(doc => produce.includes(doc)) .where(doc => under5.includes(doc)) ``` Instead, use a [covered index call](../../../data-model/indexes/#covered-queries) and [`set.where()`](../../../../reference/fql-api/set/where/) to filter the results as outlined in [Filter using `where()`](#where). For example, you can rewrite the previous query as: ```fql // Start with a covered index call. Product.byName("limes") // Layer on filters using `where()` .where(doc => doc.category == Category.byName("produce").first()!) .where(doc => doc.price < 500 ) ``` ### [](#filters)Filter using query composition The [Fauna client drivers](../../../../build/drivers/) compose queries using FQL template strings. You can interpolate variables, including other FQL template strings, into the template strings to compose dynamic queries. You can use FQL string templates to reuse filters across queries. 
For example, using the [JavaScript driver](../../../../build/drivers/js-client/): ```javascript // Start with base query let query = fql`Product.byName("limes")` // Define filters to apply const filters = [ fql`.where(doc => doc.category == Category.byName("produce").first()!)`, fql`.where(doc => doc.price < 500)` ] // Compose the final query filters.forEach(filter => { query = fql`${query}${filter}` }) ``` For more complex use cases, you can build a priority map of [indexes](../../../data-model/indexes/) and filters to automatically select the most performant approach: ```typescript const indexMap = { by_name: (name: string) => fql`Product.byName(${name})`, by_category: (cat: string) => fql`Product.byCategory(Category.byName(${cat}).first()!)` } const filterMap = { by_name: (name: string) => fql`.where(.name == ${name})`, by_category: (cat: string) => fql`.where(.category == Category.byName(${cat}).first()!)` } ``` Template: Dynamic filtering with advanced query composition Complex applications may need to handle arbitrary combinations of search criteria. In these cases, you can use [query composition](../../composition/) to dynamically apply [indexes](../../../data-model/indexes/) and [filters](#filters) to queries. The following template uses query composition to: * Automatically select the most selective index * Apply remaining criteria as filters in priority order * Support both index-based and filter-based search patterns The template uses TypeScript and the [JavaScript driver](../../../../build/drivers/js-client/). A similar approach can be used with any [Fauna client driver](../../../../build/drivers/). ```typescript /** * A JavaScript object with a sorted list of indexes or filters. * * JavaScript maintains key order for objects. * Sort items in the map from most to least selective. */ type QueryMap = Record<string, (...args: any[]) => Query> /** Object to represent a search argument. * * Contains the name of the index to use and the arguments * to pass to it. * * Example: * { name: "by_name", args: ["limes"] } * { name: "range_price", args: [{ from: 100, to: 500 }] } */ type SearchTerm = { name: string; args: any[] } /** * Composes a query by prioritizing the most selective index and then * applying filters. * * @param default_query - The initial query to which indexes and filters are applied. * @param index_map - A map of index names to functions that generate query components. * @param filter_map - A map of filter names to functions that generate query components. * @param search_terms - An array of search terms that specify the type and arguments * for composing the query. * @returns The composed query after applying all relevant indexes and filters. */ const build_search = ( default_query: Query, index_map: QueryMap, filter_map: QueryMap, search_terms: SearchTerm[] ): Query => { const _search_terms = [...search_terms] // Initialize a default query. Used if no other indexes are applicable. let query: Query = default_query // Iterate through the index map, from most to least selective. build_index_query: for (const index_name of Object.keys( index_map )) { // Iterate through each search term to check if it matches the highest priority index. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the // list and break out of the loop.
if (index_name === search_term.name) { query = index_map[search_term.name](...search_term.args) _search_terms.splice(_search_terms.indexOf(search_term), 1) break build_index_query } } } // Iterate through the filter map, from most to least selective. for (const filter_name of Object.keys(filter_map)) { // Iterate through each search term to check if it matches the highest priority filter. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the list. if (filter_name === search_term.name) { const filter = filter_map[search_term.name](...search_term.args) query = fql`${query}${filter}` _search_terms.splice(_search_terms.indexOf(search_term), 1) } } } // If there are remaining search terms, you can't build the full query. if (_search_terms.length > 0) { throw new Error("Unable to build query") } return query } ``` The following example implements the template using the [Fauna Dashboard](https://dashboard.fauna.com/)'s demo data: ```typescript // Implementation of `index_map` from the template. // Sort items in the map from most to least selective. const product_index_priority_map: QueryMap = { by_order: (id: string) => fql`Order.byId(${id})!.items.map(.product!)`, by_name: (name: string) => fql`Product.byName(${name})`, by_category: (category: string) => fql`Product.byCategory(Category.byName(${category}).first()!)`, range_price: (range: { from?: number; to?: number }) => fql`Product.sortedByPriceLowToHigh(${range})`, } // Implementation of `filter_map` from the template. // Sort items in the map from most to least selective. const product_filter_map: QueryMap = { by_name: (name: string) => fql`.where(.name == ${name})`, by_category: (category: string) => fql`.where(.category == Category.byName(${category}).first()!)`, range_price: ({ from, to }: { from?: number; to?: number }) => { // Dynamically filter products by price range. if (from && to) { return fql`.where(.price >= ${from} && .price <= ${to})` } else if (from) { return fql`.where(.price >= ${from})` } else if (to) { return fql`.where(.price <= ${to})` } return fql`` }, } // Hybrid implementation of `index_map` and `filter_map` from the template. // Combines filters and indexes to compose FQL query fragments. // Sort items in the map from most to least selective. const product_filter_with_indexes_map: QueryMap = { by_name: (name: string) => fql`.where(doc => Product.byName(${name}).includes(doc))`, by_category: (category: string) => fql`.where(doc => Product.byCategory(Category.byName(${category}).first()!).includes(doc))`, range_price: (range: { from?: number; to?: number }) => fql`.where(doc => Product.sortedByPriceLowToHigh(${range}).includes(doc))`, } const order_id = (await client.query(fql`Order.all().first()!`)) .data.id const query = build_search( fql`Product.all()`, product_index_priority_map, product_filter_with_indexes_map, [ // { name: "by_name", args: ["limes"] }, // { name: "by_category", args: ["produce"] }, { name: "range_price", args: [{ to: 1000 }] }, { name: "by_order", args: [order_id] }, ] ) const res = await client.query(query) ``` ### [](#check-for-the-existence-of-document-references)Check for the existence of document references When checking for the existence of [document references](../../../data-model/relationships/), use [`set.take()`](../../../../reference/fql-api/set/take/) to avoid unneeded document reads: ```fql // Get a Set of `Customer` documents that have: // - An address in "DC".
// - At least one order. Customer // Filter customers by state. .where(.address.state == "DC") // Filter customers by `orders` field. // The `orders` field is a computed field containing a Set of // `Order` documents for the customer. `take(1)` ensures we only // read the first document to check for existence. .where(.orders.take(1).isEmpty() == false) { id, name, email } ``` # Query best practices This guide covers best practices for querying data in Fauna. ## [](#use-indexes-for-commonly-accessed-data)Use indexes for commonly accessed data [Indexes](../../data-model/indexes/) are the most important and effective tool to increase performance and reduce the cost of your queries. Avoid [uncovered queries](../../data-model/indexes/#covered-queries) whenever possible. To reduce document reads, include any frequently queried fields in indexes. | See Indexes | | --- | --- | --- | ## [](#avoid-storing-unneeded-history)Avoid storing unneeded history A [collection schema](../../schema/#collection-schema)'s [`history_days`](../../doc-history/#history-retention) setting defines the number of days of history to retain as document snapshots. You can use these historical snapshots to run [temporal queries](../../doc-history/#temporal-query) or replay events in [event feeds and event streams](../../cdc/). Avoid storing unnecessary history. A high `history_days` setting has several impacts: * **Increased read ops:** To support [temporal queries](../../doc-history/#temporal-query), indexes cover field values from both current documents and their [historical document snapshots](../../doc-history/). To enable quicker [sorting](../../data-model/indexes/#sort-documents) and [range searches](../../data-model/indexes/#range-search), current and historical index entries are stored together, sorted by index `values`. All indexes implicitly include an ascending [document `id`](../../data-model/documents/#meta) as the index’s last value. When you read data from an index, including the [`collection.all()`](../../../reference/fql-api/collection/instance-all/) index, Fauna must read from both current and historical index entries to determine if they apply to the query. Fauna then filters out any data not returned by the query. You are charged for any Transactional Read Operations (TROs) used to read current or historical index data, including data not returned by the query. You are not charged for any historical data older than the retention period set by the [`history_days` setting](../../doc-history/#history-retention). * **Longer index build times:** Because indexes include historical data, a high `history_days` setting can increase the [index build times](../../data-model/indexes/#builds). * **Increased query latency on indexes:** If an indexed field value changes frequently, the index must retain more historical data. A high `history_days` setting can increase query latency on the index. * **Increased storage:** More document snapshots and historical index data is retained, consuming additional database storage and increasing storage costs. ## [](#use-index-terms-for-exact-match-searches)Use index terms for exact match searches For the best performance, especially on large datasets, use an index with terms to filter collection documents based on an exact field value. 
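For example, here is a minimal sketch that assumes the demo data used throughout this guide, where a `Product` document’s `category` field stores a `Category` document reference. The index’s `terms` support exact match searches on that field:

```fsl
collection Product {
  ...
  // Defines the `byCategory()` index.
  // `terms` are document fields for exact match searches.
  index byCategory {
    terms [.category]
    values [.name, .price]
  }
  ...
}
```

Call the index with the exact value to match instead of scanning the collection:

```fql
// Uses the `byCategory()` index term for an exact match search.
// Because `name` and `price` are index values, projecting only
// those fields keeps the query covered and avoids document reads.
Product.byCategory(Category.byName("produce").first()) {
  name,
  price
}
```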
[`collection.where()`](../../../reference/fql-api/collection/instance-where/) and [`collection.firstWhere()`](../../../reference/fql-api/collection/instance-firstwhere/) require a scan of the entire collection and aren’t performant on large collections. Avoid using frequently updated fields as index terms. See [Avoid using frequently updated fields as index terms](#frequently-updated-terms). | See Index terms | | --- | --- | --- | ## [](#frequently-updated-terms)Avoid using frequently updated fields as index terms Internally, Fauna [partitions](../../data-model/indexes/#partitions) indexes based on its terms, if present. Frequent updates to term field values trigger updates to these partitions. If you need to filter or run an exact match search on a frequently updated field, consider adding the field as an index value instead: ```fsl collection Product { ... // Defines the `sortedByName()` index. // The index includes the `name` field as an index value. // `name` is a frequently updated field. index sortedByName { values [.name, .description, .price] } } ``` Then use the index to run a [range search](../../data-model/indexes/#range-search) on the index value: ```fql // Uses the `sortedByName()` index to run a range search // on `name` field values. The query only retrieves `Product` // collection documents with a `name` of `limes`. The query // is covered and avoids document reads. Product.sortedByName({ from: "limes", to: "limes" }) { name, description, price } ``` ## [](#use-index-values-for-sorting-and-range-searches)Use index values for sorting and range searches For large collections, use index values instead of [`set.order()`](../../../reference/fql-api/set/order/) to sort a collection’s documents. Reserve [`set.order()`](../../../reference/fql-api/set/order/) for small, ad-hoc sorting on Sets of one page or less. Similarly, avoid using [`collection.where()`](../../../reference/fql-api/collection/instance-where/) to perform range searches on large collections. Instead, run a range search on index values. | See Index values | | --- | --- | --- | ## [](#use-projection-to-only-retrieve-fields-you-need)Use projection to only retrieve fields you need Projection lets you select the fields to return from a document or Set. For the best performance and costs, use an index and only project fields covered as an index term or value. This lets you read data from the index rather than the underlying documents. To reduce unneeded compute operations, use projection to only fetch [computed fields](../../../reference/fsl/computed/) when needed. Computed fields aren’t persistently stored as part of the document. Instead, the field’s value is computed on each read. | See Projection and field aliasing | | --- | --- | --- | ## [](#return-null-on-document-writes)Return `null` on document writes Methods that create or write to a document, such as [`Collection.create()`](../../../reference/fql-api/collection/static-create/) or [`document.update()`](../../../reference/fql-api/document/update/), typically return the document. An FQL query only returns the result of its last statement. If you don’t use the returned document, you can add a `null` statement to the end of the query to return `null` instead. This can lower egress costs. For example: ```fql // The `update()` call writes to a `Product` collection document. // `update()` returns the updated document. Product.byId("111") ?.update({ price: 75 }) // FQL queries return the result of the last statement. 
// The `null` statement ensures the query returns `null`. null ``` ## [](#use-pagesize-for-pagination)Use `pageSize()` for pagination In most cases, you should use [`set.pageSize()`](../../../reference/fql-api/set/pagesize/), not [`set.paginate()`](../../../reference/fql-api/set/paginate/), to control the page size of paginated results. Unlike `pageSize()`, `paginate()`: * Uses eager loading and fetches results instantly, even if the results aren’t returned or used. This can produce slower and more wasteful queries. * Is not compatible with Fauna client driver pagination methods. | See Pagination | | --- | --- | --- | ## [](#create-udfs-for-complex-queries-and-workflows)Create UDFs for complex queries and workflows A [user-defined function (UDF)](../../schema/user-defined-functions/) is a set of one or more FQL statements stored as a reusable resource in a Fauna database. UDFs are composable, letting you combine multiple UDFs to create more complex functions or workflows. In most cases, a query that’s reused in multiple places or that involves complex logic should be stored as a UDF. | See User-defined functions (UDFs) | | --- | --- | --- | # Transactions In a database, a transaction is a sequence of operations performed as a single unit of work. For example, if you transfer money between accounts, both the debit and credit operations must succeed, or neither should occur. In Fauna, every query is an ACID-compliant transaction, even across globally distributed region groups. ## [](#acid)ACID properties Database transactions follow ACID properties to ensure data integrity: * **Atomicity:** All operations in a transaction must complete successfully, or the transaction fails entirely. If any operation fails, the successful operations are rolled back and nothing is committed. There is no partial success. * **Consistency:** A transaction can only bring the database from one valid state to another valid state. * **Isolation:** Concurrent transactions should not affect each other. The isolation level determines how transaction integrity is maintained. * **Durability:** Once a transaction is committed, it remains committed. ## [](#isolation)Isolation levels An isolation level refers to how a database maintains the integrity of its transactions. When a transaction is processed, the isolation level determines whether or how the transaction might be affected by other concurrent database operations. Fauna transactions support three different isolation levels: * [Serializability](#serializability) * [Strict serializability](#strict-serializability) * [Snapshot isolation](#snapshot-isolation) ### [](#serializability)Serializability Serializability guarantees that concurrent transactions produce the same result as if they ran one after another. With serializability, developers don’t need to reason about concurrency: if each transaction is correct in isolation, the transactions are also correct when run together. For example, if two users simultaneously update their profiles, serializability ensures both updates complete correctly, as if one happened after the other. ### [](#strict-serializability)Strict serializability Strict serializability, also called linearizability, combines serializability with real-time ordering guarantees. For example, if transaction A completes before transaction B begins, A’s effects are always visible to B. This is crucial for scenarios requiring strict ordering, such as implementing a reservation system where the first request should always win. Strict serializability is the strongest isolation level available.
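To see how these guarantees play out in a read-write query, here is a minimal sketch of the funds transfer described at the start of this section. It assumes a hypothetical `Account` collection with a numeric `balance` field. Because the entire query runs as one strictly serialized transaction, either both updates commit or neither does:

```fql
// Sketch only: assumes a hypothetical `Account` collection
// with a numeric `balance` field.
let source = Account.byId("111")!
let destination = Account.byId("222")!

// Abort the whole transaction if funds are insufficient.
// No partial changes are committed.
if (source.balance < 100) {
  abort("Insufficient funds")
}

// Both writes are part of the same transaction.
source.update({ balance: source.balance - 100 })
destination.update({ balance: destination.balance + 100 })
```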
### [](#snapshot-isolation)Snapshot isolation Fauna [indexes](../data-model/indexes/) maintain a virtual snapshot of indexed documents. You can think of these snapshots as a "photograph" of the database at a specific time. When a query reads from an index without serialization, each transaction works with its own consistent snapshot of the data. This prevents "dirty" or non-repeatable reads. Snapshot isolation works best for scenarios where you need a consistent view of the data but don’t require strict ordering guarantees, such as generating analytics reports. ## [](#practice)Isolation levels in practice The following table summarizes the isolation levels available in Fauna based on a transaction’s operation types:

| Index reads | Read-write | Read-only |
| --- | --- | --- |
| None (documents only) | Strict serializability | Serializable |
| Only serialized or unique indexes | Strict serializability | Serializable |
| Indexes without serialization | Snapshot isolation | Snapshot isolation |

### [](#read-write)Read-write transactions Read-write (or write-only) transactions are always strictly serialized and don’t require additional configuration. ### [](#read-only)Read-only transactions By default, read-only transactions are serializable for reads that don’t involve indexes. This provides strong consistency without requiring strict ordering guarantees. To opt in to strict serializability for read-only transactions, you can use either of these methods: * Include an `X-Linearized` header set to `true` for the transaction’s [Query API](../../reference/http/reference/core-api/#operation/query) request. * Include a no-op write in the transaction. #### [](#driver)Driver configuration The Fauna client drivers include configuration options for the `X-Linearized` header: * [JavaScript driver API reference](https://fauna.github.io/fauna-js/latest/interfaces/ClientConfiguration.html#linearized) * [Python driver API reference](https://fauna.github.io/fauna-python/latest/api/fauna/client/client.html#Client) * [Go driver API reference](https://pkg.go.dev/github.com/fauna/fauna-go/v2#Linearized) * [.NET/C# driver API reference](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_query_options.html) * [JVM driver API reference](https://fauna.github.io/fauna-jvm/latest/com/fauna/query/QueryOptions.Builder.html) # Contended transactions In Fauna, every query is an [ACID-compliant transaction](../). Contention occurs when multiple transactions try to access the same data at the same time and at least one transaction attempts to write to it. ## [](#causes)Causes of contention * **Write contention** (most common): Occurs when a transaction reads or writes to a document, or reads an [index entry](../../data-model/indexes/), and that document or index entry is concurrently being written to by another transaction. * **Stale schema cache** (less common): Occurs when a change in [database schema](../../schema/) is detected during transaction execution. The transaction is retried against the latest schema. ## [](#retries)Manage retries for contended transactions Fauna detects contention and automatically retries contended transactions. These retries occur within Fauna and don’t require action by the client. Retries consume additional read, write, and/or compute operations and add transaction latency.
If you’re using the [Fauna Core HTTP API](../../../reference/http/reference/core-api/), you can use the `X-Max-Contention-Retries` HTTP header to control the number of retry attempts, not including the original query request: ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Max-Contention-Retries: 5' \ -d '{ "query": "Product.all()" }' ``` `X-Max-Contention-Retries` defaults to `0` (No retries). Most Fauna client drivers include configuration options for the `X-Max-Contention-Retries` header: * [JavaScript driver docs](../../../build/drivers/js-client/#config) * [Python driver docs](../../../build/drivers/py-client/#config) * [Go driver docs](../../../build/drivers/go-client/#config) * [.NET/C# driver docs](../../../build/drivers/dotnet-client/#config) ### [](#errors)Error handling If a contended transaction exhausts retries, the [Core HTTP API](../../../reference/http/reference/core-api/) returns an error with a `contended_transaction` [error code](../../../reference/http/reference/errors/): ```json { "error": { "code": "contended_transaction", "message": "Transaction was aborted due to detection of concurrent modifications to " } } ``` Fauna’s client drivers include classes for contended transaction errors: * JavaScript driver: [`ContendedTransactionError`](https://fauna.github.io/fauna-js/latest/classes/ContendedTransactionError.html) * Python driver: [`ContendedTransactionError`](https://fauna.github.io/fauna-python/latest/api/fauna/errors/errors.html#ContendedTransactionError) * Go driver: [`ErrContendedTransaction`](https://pkg.go.dev/github.com/fauna/fauna-go/v2#ErrContendedTransaction) * .NET/C# driver: [`ContendedTransactionException`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_exceptions_1_1_contended_transaction_exception.html) * JVM driver: [`ContendedTransactionException`](https://fauna.github.io/fauna-jvm/latest/com/fauna/exception/ContendedTransactionException.html) ## [](#fix)How to minimize contention Depending on your use case, occasional contention may be unavoidable. However, you can follow these best practices to minimize contention. ### [](#strict-ser)Use strict serialization only when needed Fauna uses [strict serialization](../#strict-serializability), or linearization, for all read-write transactions. By default, read-only transactions are serializable but not strictly serialized. 
If you’re using the [Core HTTP API](../../../reference/http/reference/core-api/), you can use the `X-Linearized` HTTP header to opt-in to strict serializability for read-only transactions: ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Linearized: true' \ -d '{ "query": "Product.all()" }' ``` The Fauna client drivers include configuration options for the `X-Linearized` header: * [JavaScript driver API reference](https://fauna.github.io/fauna-js/latest/interfaces/ClientConfiguration.html#linearized) * [Python driver API reference](https://fauna.github.io/fauna-python/latest/api/fauna/client/client.html#Client) * [Go driver API reference](https://pkg.go.dev/github.com/fauna/fauna-go/v2#Linearized) * [NET/C# driver API reference](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_query_options.html) * [JVM driver API reference](https://fauna.github.io/fauna-jvm/latest/com/fauna/query/QueryOptions.Builder.html) While it provides the strongest level of consistency, strict serialization can increase the likelihood of contention. Only opt-in for strict serialization on read-only transactions if required for your use case. ### [](#schema)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../schema/manage-schema/#unstaged) can cause [contended transactions](./), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../schema/manage-schema/#staged) * Perform unstaged schema changes sequentially # Document time-to-live (TTL) | Reference: FSL collection schema | | --- | --- | --- | A document can include an optional `ttl` (time-to-live) metadata field that contains the document’s expiration timestamp: ```fql Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" }, // Sets a `ttl` 60 days after the current time. // After the `ttl` passes, Fauna deletes the document. ttl: Time.now().add(60, "days") }) ``` After the `ttl` timestamp passes, Fauna permanently deletes the document. Use the `ttl` field to automatically expire documents and control data retention. Documents without a `ttl` or a `ttl` of `null` persist indefinitely. ## [](#enable-disable-ttl-writes)Enable or disable TTL writes A collection schema’s `document_ttls` field controls whether you can write to the `ttl` field of collection documents: ```fsl collection Customer { ... // Explicitly set `document_ttls` to `true` document_ttls true ... } ``` If the collection schema contains [field definitions](../../reference/fsl/field-definitions/), `document_ttls` defaults to `false`. Otherwise, `document_ttls` defaults to `true`. `document_ttls` does not affect: * An existing document’s `ttl`. Fauna deletes documents based on their `ttl`, even if `document_ttls` is `false`. * The collection schema’s [`ttl_days`](#set-default-ttl) field. 
If set, `ttl_days` assigns a `ttl` based on the document’s last write, even if `document_ttls` is `false`. ## [](#set-default-ttl)Set a default TTL A collection schema’s `ttl_days` field sets a default `ttl` for collection documents: ```fsl collection Customer { ... // Set default document `ttl` to 365 days after // the last write. ttl_days 365 ... } ``` When `ttl_days` is set, new and updated collection documents are assigned a `ttl` of `ttl_days` days after the last document write. For example, if `ttl_days` is set to `30`, each document is assigned a `ttl` of 30 days after its most recent update. You can overwrite the default by assigning an explicit `ttl`. ### [](#set-default-ttl-existing-docs)Set a default TTL for existing documents Changing `ttl_days` on an existing collection does not affect the `ttl` of existing, unchanged documents. To explicitly set the `ttl` of all collection documents, use [`set.pageSize()`](../../reference/fql-api/set/pagesize/) and [`set.paginate()`](../../reference/fql-api/set/paginate/) to iterate through the collection over several queries: ```fql // First query. // Gets `Customer` collection documents with a `ttl` // of `null` (unset). Use `pageSize()` // and `paginate()` to paginate results and // limit each page to 20 documents. let page = Customer.all() .where(.ttl == null) .pageSize(20).paginate() let data = page.data data.forEach(document => document.update({ // Set `ttl` as the document timestamp (`ts`) + 365 days. ttl: document.ts.add(365, "days") })) page { after } ``` ``` { after: "hdW..." } ``` Subsequent queries use the cursor and [`Set.paginate()`](../../reference/fql-api/set/static-paginate/) to iterate through the remaining pages: ```fql // Subsequent queries. // Uses `Set.paginate()` to iterate through // subsequent pages. let page = Set.paginate("hdW...") let data = page.data data.forEach(document => document.update({ ttl: document.ts.add(365, "days") })) page { after } ``` ### [](#field-df)Set a default `ttl` using field definitions `ttl` is a [reserved field](../../reference/fql/reserved/#reserved-schema). You can’t use [field definitions](../../reference/fsl/field-definitions/) to set a default `ttl` for new documents. To set a default `ttl`, use [`ttl_days`](#set-default-ttl) or use your client application to set the default in document creation queries. You can use a [check constraint](../../reference/fsl/check/) to ensure documents have a `ttl` and enforce a `ttl` policy. See [Enforce TTL policies with check constraints](#check-constraints). ## [](#check-constraints)Enforce TTL policies with check constraints You can use a [check constraint](../../reference/fsl/check/) to ensure a `ttl` is set in new and updated documents: ```fsl collection Product { ... // Check constraint. // Requires that documents contain a `ttl`. check ttlPolicy (.ttl != null) ... } ``` You can also ensure `ttl` values meet a pre-defined rule. For example: ```fsl collection Product { ... // Check constraint. Ensures the `ttl` of `Product` // documents is no more than 30 days from the current date. // Requires that documents contain a `ttl`. check ttlPolicy (.ttl?.difference(Time.now(), 'days') <= 30) ... } ``` To enforce a similar rule but make `ttl` optional: ```fsl collection Product { ... // Check constraint. Ensures the `ttl` of `Product` // documents, if set, is no more than 30 days from the // current date. `ttl` is optional. check ttlPolicy ( .ttl == null || .ttl?.difference(Time.now(), 'days') <= 30 ) ...
} ``` ## [](#ttl-doc-history)TTL and document history When a document’s `ttl` timestamp passes, Fauna only deletes the latest version of the document. Historical snapshots of the document are still available using the [`at` expression](../doc-history/#temporal-query). You can control the retention of historical snapshots using the collection schema’s [`history_days`](../doc-history/) field. ## [](#ttl-event-sources)TTL and event sources [Event sources](../cdc/) don’t emit `remove` events for documents deleted due to an expired TTL. Such documents are deleted lazily upon expiration. # Document history Fauna stores snapshots of each document’s history. Fauna creates these snapshots each time the document receives a write. Fauna indexes also store the history of index `terms` or `values` field values. The historical snapshots act as versions of a document. You can use the snapshots to get a point-in-time view of documents and audit changes. ## [](#temporal-query)Run a temporal query | Reference: at expression | | --- | --- | --- | You can use an [`at` expression](../../reference/fql/statements/#at) to run a query on the snapshot of one or more documents. This is called a temporal query. For example, the following query retrieves a snapshot of a document from yesterday: ```fql let yesterday = Time.now().subtract(1, "day") at (yesterday) { Product.byName("avocados").first() } ``` If available, Fauna returns the document at the time of the `at` expression. ## [](#history-retention)History retention | Reference: FSL collection schema | | --- | --- | --- | A [collection schema](../schema/#collection-schema)'s `history_days` setting defines the number of days of history to retain as document snapshots: ```fsl collection Customer { ... // Number of days of document history to retain. history_days 3 } ``` After `history_days` passes, snapshots before the related time are deleted and become inaccessible. `history_days` also affects events available for event feeds and event streams. See [event feeds and event streams](../cdc/#setup). ### [](#mvt)Minimum viable timestamp (MVT) The minimum viable timestamp (MVT) is the earliest point in time that you can query a collection’s document history. The MVT is calculated as the query timestamp minus the collection’s [`history_days`](#history-retention) setting: ``` MVT = query timestamp - collection's `history_days` ``` [Temporal queries](#temporal-query) using an `at` expression can’t access document snapshots that are older than the MVT. Any query that attempts to access a document snapshot before the MVT returns an error with the `invalid_request` [error code](../../reference/http/reference/errors/) and a 400 HTTP status code: ``` { "code": "invalid_request", "message": "Requested timestamp 2099-01-09T00:36:53.334372Z less than minimum allowed timestamp 2099-01-10T00:21:53.334372Z." } ``` For example, if a collection has `history_days` set to `3` and you run a query that attempts to access a document in the collection from 4 days ago, the query returns an error. ### [](#default-history-days)Default history days If omitted or unset, `history_days` defaults to `0`, which only retains the current version of each document. No history is retained. If `history_days` is `0` or unset, the replay period for events in [event feeds and event streams](../cdc/#setup) is limited to 15 minutes. 
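With the default setting, historical reads are effectively disabled. For example, in a minimal sketch reusing the `at` expression and demo data from above, if the `Product` collection leaves `history_days` unset, any past snapshot falls before the collection’s MVT and the query returns an `invalid_request` error:

```fql
// With `history_days` unset (or `0`) on the `Product` collection,
// this snapshot read falls before the MVT and returns an
// `invalid_request` error.
let oneHourAgo = Time.now().subtract(1, "hour")
at (oneHourAgo) {
  Product.byName("avocados").first()
}
```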
### [](#decreasing-history-days)Decreasing history days If you decrease `history_days`, any snapshots created before the new `history_days` setting are deleted and become inaccessible. ### [](#increasing-history-days)Increasing history days Increasing `history_days` does not recreate or recover previously inaccessible snapshots. For example, increasing `history_days` from `0` to `7` does not recreate historical snapshots for the last 7 days. Instead, Fauna begins storing snapshots beginning at the time of the schema update. ### [](#impacts)Impacts on read ops, storage, latency, and indexing Avoid storing unnecessary history. A high `history_days` setting has several impacts: * **Increased read ops:** To support [temporal queries](#temporal-query), indexes cover field values from both current documents and their [historical document snapshots](./). To enable quicker [sorting](../data-model/indexes/#sort-documents) and [range searches](../data-model/indexes/#range-search), current and historical index entries are stored together, sorted by index `values`. All indexes implicitly include an ascending [document `id`](../data-model/documents/#meta) as the index’s last value. When you read data from an index, including the [`collection.all()`](../../reference/fql-api/collection/instance-all/) index, Fauna must read from both current and historical index entries to determine if they apply to the query. Fauna then filters out any data not returned by the query. You are charged for any Transactional Read Operations (TROs) used to read current or historical index data, including data not returned by the query. You are not charged for any historical data older than the retention period set by the [`history_days` setting](#history-retention). * **Longer index build times:** Because indexes include historical data, a high `history_days` setting can increase the [index build times](../data-model/indexes/#builds). * **Increased query latency on indexes:** If an indexed field value changes frequently, the index must retain more historical data. A high `history_days` setting can increase query latency on the index. * **Increased storage:** More document snapshots and historical index data is retained, consuming additional database storage and increasing storage costs. # Event feeds and event streams A Fauna [event source](#event-source) emits an event whenever tracked changes are made to a database. Applications can consume the events in two ways: * **[Event feeds](#event-feeds)** : Asynchronous requests that poll the event source for paginated events. * **[Event streams](#event-streaming)**: A real-time subscription that pushes events from the event source to your application using an open connection to Fauna. ## [](#use-cases)Use cases [Event feeds](#event-feeds) and [event streams](#event-streaming) are useful for building features that need to react to data changes, such as: * Real-time dashboards * Chat apps * Pub/sub integration * Multiplayer games ## [](#setup)Before you start Event feeds and event streams replay events from a collection’s [document history](../doc-history/). The number of days of history to retain is defined by the [collection schema](../schema/#collection-schema)'s [`history_days`](../doc-history/#history-retention) setting: ```fsl collection Customer { ... // Number of days of document history to retain. history_days 3 } ``` You can’t replay events outside this period. If `history_days` is `0` or unset, the period is limited to 15 minutes. 
Increasing `history_days` can also [impact read ops, storage, latency, and indexing](../doc-history/#impacts). ## [](#event-source)Create an event source To create an event source, call [`set.eventSource()`](../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../reference/fql-api/set/eventson/) on a [supported Set](#sets) in an FQL query: * [`set.eventSource()`](../../reference/fql-api/set/eventsource/) tracks all documents in the Set: ```fql Product.all().eventSource() ``` * [`set.eventsOn()`](../../reference/fql-api/set/eventson/) tracks changes to specified document fields in the Set: ```fql Product.sortedByPriceLowToHigh().eventsOn(.price) ``` ### [](#token)Event source tokens [`set.eventSource()`](../../reference/fql-api/set/eventsource/) and [`set.eventsOn()`](../../reference/fql-api/set/eventson/) return a string-encoded token that represents the event source. The token has the [EventSource](../../reference/fql/types/#event-source) type: ``` "g9WD1YPG..." ``` You can use the token to consume the event source as an [event feed](#event-feeds) or an [event stream](#event-streaming). Applications typically create feeds and streams using a [Fauna client driver](../../build/drivers/). The drivers provide methods for creating feeds and streams without directly handling event source tokens. #### [](#timestamp)Event source token composition The event source token is a hash that includes: * The event source query. The query determines the events returned in event feeds and event streams that consume the source. * The snapshot timestamp for the query that created the event source. This timestamp is the default start time for event feeds or event streams that consume the source. Event source tokens are not idempotent: running the same event source query multiple times produces different event source tokens. ## [](#event-feeds)Event feeds To use event feeds, you must have a Pro or Enterprise plan. The following Fauna client drivers support event feeds: * [JavaScript driver](../../build/drivers/js-client/#event-feeds) * [Python driver](../../build/drivers/py-client/#event-feeds) * [Go driver](../../build/drivers/go-client/#event-feeds) * [.NET/C# driver](../../build/drivers/dotnet-client/#event-feeds) * [JVM driver](../../build/drivers/jvm-client/#event-feeds) ### [](#example)Example With the [JavaScript driver](../../build/drivers/js-client/), use `feed()` to define an event source and return the event source’s paginated events. To get the first page of events, you typically specify a `start_ts` (start timestamp) in the initial `feed()` request. Each page of events includes a top-level `cursor`. In subsequent requests, you can pass this `cursor` instead of a `start_ts` to `feed()`. This polls for events after the cursor (exclusive): ```javascript import { Client, fql } from "fauna"; const client = new Client(); async function processFeed(client, query, startTs = null, sleepTime = 300) { let cursor = null; while (true) { // Only include `start_ts` if `cursor` is null. Otherwise, only include `cursor`. const options = cursor === null ?
{ start_ts: startTs } : { cursor: cursor }; const feed = client.feed(query, options); for await (const page of feed) { for (const event of page.events) { switch (event.type) { case "add": console.log("Add event: ", event); break; case "update": console.log("Update event: ", event); break; case "remove": console.log("Remove event: ", event); break; } } // Store the cursor of the last page cursor = page.cursor; } // Clear startTs after the first request startTs = null; console.log(`Sleeping for ${sleepTime} seconds...`); await new Promise(resolve => setTimeout(resolve, sleepTime * 1000)); } } const query = fql`Product.all().eventsOn(.price, .stock)`; // Calculate timestamp for 10 minutes ago const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; processFeed(client, query, startTs); ``` If needed, you can store the cursor as a collection document. For an example, see the [event feeds sample app](../../build/sample-apps/event-feeds/). ### [](#event-feeds-sample-app)Event feeds sample app The [event feeds sample app](../../build/sample-apps/event-feeds/) shows how you can use event feeds to track changes to a database. The app uses an AWS Lambda function to send events for related changes to another service. | See Event feeds sample app | | --- | --- | --- | ### [](#how-event-feeds-work)How event feeds work To request an event feed for an event source, the client driver sends a request containing an [event source token](#token) to the [event feed HTTP API endpoint](../../reference/http/reference/core-api/#operation/feed). When you first poll an event source using an event feed, you usually specify a `start_ts` (start timestamp). `start_ts` is an integer representing a time in microseconds since the Unix epoch. The request returns events that occurred after the specified timestamp (exclusive). `page_size` limits the number of events returned per page: ```bash curl -X POST \ 'https://db.fauna.com/feed/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "", "start_ts": 1710968002310000, "page_size": 10 }' ``` The response includes an array of events for the event source: ``` { "events": [ { "type": "update", "data": { "@doc": { "id": "111", "coll": { "@mod": "Product" }, "ts": { "@time": "2099-09-04T21:14:29.970Z" }, "name": "cups", "description": "Translucent 9 Oz, 100 ct", ... } }, "txn_ts": 1725484469970000, "cursor": "gsGabc123", "stats": { "read_ops": 1, "storage_bytes_read": 320, "compute_ops": 1, "processing_time_ms": 1, "rate_limits_hit": [] } }, ... ], "cursor": "gsGabc456", // Top-level cursor "has_next": true, "stats": { "read_ops": 9, "storage_bytes_read": 886, "compute_ops": 1, "processing_time_ms": 8, "rate_limits_hit": [] } } ``` If the response’s `has_next` property is `true`, the response includes a top-level `cursor` property. The client driver can use this cursor to get the next page of events: ```bash curl -X POST \ 'https://db.fauna.com/feed/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "", "cursor": "gsGabc456", "page_size": 10 }' ``` Response: ``` { "events": [ { "type": "update", "data": { "@doc": { "id": "111", "coll": { "@mod": "Product" }, "ts": { "@time": "2099-09-04T21:14:29.970Z" }, "name": "clear cups", "description": "Translucent 9 Oz, 100 ct", ...
} }, "txn_ts": 1725484469970000, "cursor": "gsGabc456", "stats": { "read_ops": 1, "storage_bytes_read": 320, "compute_ops": 1, "processing_time_ms": 1, "rate_limits_hit": [] } }, ... ], "cursor": "gsGabc789", "has_next": true, "stats": { "read_ops": 9, "storage_bytes_read": 886, "compute_ops": 1, "processing_time_ms": 8, "rate_limits_hit": [] } } ``` You can reuse cursors across event sources with identical queries in the same database. ### [](#feed-start-ts)Get events after a specific start time To get events after a specific time, the client driver uses the `start_ts` request body parameter: ```bash curl -X POST \ 'https://db.fauna.com/feed/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "", "page_size": 10, "start_ts": 1710968002310000 }' ``` The period between the request and the `start_ts` can’t exceed the `history_days` setting for the source Set’s collection. If `history_days` is `0` or unset, the period is limited to 15 minutes. Requests that use a `start_ts` older than this period return an [`error` event](#error) with the [`invalid_start_time` error code](#invalid-start). ### [](#feed-cursor)Get events after a specific cursor To get events from a previous event’s [cursor](#cursor), the client driver uses the `cursor` request body parameter. The event source will replay events that occurred after the cursor (exclusive): ```bash curl -X POST \ 'https://db.fauna.com/feed/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "", "cursor": "gsGabc456", "page_size": 10 }' ``` The period between the request and the `cursor` event’s `txn_ts` (transaction timestamp) can’t exceed the `history_days` setting for the [source Set](#sets)'s collection. If `history_days` is `0` or unset, the period is limited to 15 minutes. Requests that use a cursor older than this period return an [`error` event](#error) with the [`invalid_start_time` error code](#invalid-start). You can reuse cursors across event sources with identical queries in the same database. ### [](#feed-default)Default start time If an event feed request doesn’t specify a [`start_ts`](#feed-start-ts) (start timestamp) or [`cursor`](#feed-cursor), the request’s `start_ts` defaults to the [event source query’s timestamp](#timestamp). If the timestamp is outside the [history retention period](../doc-history/#history-retention) of the [source Set](#sets)'s collection, the request returns an [`error` event](#error) with the [`invalid_start_time` error code](#invalid-start). 
## [](#event-streaming)Event streams The following Fauna client drivers support real-time event streams: * [JavaScript driver](../../build/drivers/js-client/#event-streaming) * [Python driver](../../build/drivers/py-client/#event-streaming) * [Go driver](../../build/drivers/go-client/#event-streaming) * [.NET/C# driver](../../build/drivers/dotnet-client/#event-streaming) * [JVM driver](../../build/drivers/jvm-client/#event-streaming) ### [](#example-2)Example With the [JavaScript driver](../../build/drivers/js-client/), you use the `stream()` function to define and subscribe to an event source in real time: ```javascript import { Client, fql } from "fauna"; const client = new Client(); const query = fql`Product.where(.type == 'book' && .price < 100_00).eventSource()`; const stream = client.stream(query); try { for await (const event of stream) { switch (event.type) { case "add": // Do something on add console.log(event.data); break; case "update": // Do something on update console.log(event.data); break; case "remove": // Do something on remove console.log(event.data); break; } } } catch (error) { console.log(error); } ``` You can also pass an [event source token](#token) to `stream()`. This lets you get query results alongside the stream: ```javascript import { Client, fql } from "fauna"; const client = new Client(); const query = fql` let products = Product.where( .type == 'book' && .price < 100_00) { products: products, eventSource: products.eventSource() }`; const response = await client.query(query); const { products, eventSource } = response.data; for await (const product of client.paginate(products)) { console.log(product); } const stream = client.stream(eventSource); try { for await (const event of stream) { switch (event.type) { case "add": // Do something on add console.log(event.data); break; case "update": // Do something on update console.log(event.data); break; case "remove": // Do something on remove console.log(event.data); break; } } } catch (error) { console.log(error); } ``` ### [](#event-streams-sample-app)Event streams sample app The [event streams sample app](../../build/sample-apps/streaming/) shows how you can use event streams to build a real-time chat app. You can use it as a starting point for your own app. | See Event streams sample app | | --- | --- | --- | ### [](#how-event-streams-work)How event streams work To subscribe to an event source’s events in real time, the client driver sends a request containing the [event source token](#token) to the [Event stream HTTP API endpoint](../../reference/http/reference/core-api/#operation/stream): ```bash curl -X POST \ 'https://db.fauna.com/stream/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "" }' ``` In response, the event source emits a `status` event, indicating the stream has started. ```json { "type": "status", "txn_ts": 1710968002310000, "cursor": "gsGabc123", "stats": { "read_ops": 8, "storage_bytes_read": 208, "compute_ops": 1, "processing_time_ms": 0, "rate_limits_hit": [] } } ``` The [Event stream API request](../../reference/http/reference/core-api/#operation/stream)'s connection remains open. If a tracked change occurs, the event source emits a related `add`, `remove`, or `update` event. 
These events include the triggering document, encoded using the [tagged format](../../reference/http/reference/wire-protocol/#tagged), in the `data` field: ```json { "type": "update", "data": { "@doc": { "id": "392914348360597540", "coll": { "@mod": "Product" }, "ts": { "@time": "2099-03-21T12:35:18.680Z" }, "name": "pizza", "description": "Frozen Cheese", ... } }, "txn_ts": 1711024518680000, "cursor": "gsGdef456", "stats": { ... } } ``` If a change occurs between the creation of the event source and the start of a stream, the stream replays and emits the related events. ### [](#stream-default)Default start time If an event stream request doesn’t specify a [`start_ts`](#restart-txn-ts) (start timestamp) or [`cursor`](#restart-cursor), `start_ts` defaults to the [event source query’s timestamp](#timestamp). If the timestamp is outside the [history retention period](../doc-history/#history-retention) of the [source Set](#sets)'s collection, the stream returns an [`error` event](#error) with the [`invalid_start_time` error code](#invalid-start). ### [](#disconnection)Stream disconnection Fauna’s client drivers can detect connection loss and automatically reconnect disconnected [event stream](#event-streaming). Events that occur during network issues are replayed and emitted when the stream reconnects. When a stream reconnects, the event source emits a new `status` event: ```json { "type": "status", "txn_ts": 1710968002310000, "cursor": "gsGabc123", "stats": { "read_ops": 8, "storage_bytes_read": 208, "compute_ops": 1, "processing_time_ms": 0, "rate_limits_hit": [] } } ``` ### [](#restart)Restart an event stream The [Event stream HTTP API endpoint](../../reference/http/reference/core-api/#operation/stream) supports two methods for restarting disconnected streams: * [Restart from an event cursor](#restart-cursor) * [Restart from a transaction timestamp](#restart-txn-ts) The methods are mutually exclusive and can’t be used together. #### [](#restart-cursor)Restart from an event cursor To restart a stream from a previous event’s [cursor](#cursor), the client driver uses the `cursor` request body parameter. The restarted stream will replay events that occurred after the cursor (exclusive): ```bash curl -X POST \ 'https://db.fauna.com/stream/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "", "cursor": "gsGabc123" }' ``` The period between the stream restart and the `cursor` event’s `txn_ts` (transaction timestamp) can’t exceed the `history_days` setting for the source Set’s collection. If `history_days` is `0` or unset, the period is limited to 15 minutes. Requests that use a cursor older than this period return an [`error` event](#error) with the [`invalid_start_time` error code](#invalid-start). #### [](#restart-txn-ts)Restart from a transaction timestamp To restart a stream from a transaction timestamp, the client driver uses the `start_ts` request body parameter. `start_ts` is an integer representing the stream start time in microseconds since the Unix epoch: ```bash curl -X POST \ 'https://db.fauna.com/stream/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -d '{ "token": "", "start_ts": 1710968002310000 }' ``` The period between the stream restart and the `start_ts` can’t exceed the `history_days` setting for the source Set’s collection. If `history_days` is `0` or unset, the period is limited to 15 minutes. 
Requests that use a `start_ts` older than this period return an [`error` event](#error) with the [`invalid_start_time` error code](#invalid-start). For event streams, `start_ts` must be after the [event source query’s timestamp](#timestamp). ### [](#permission-changes)Permission changes If the [authentication secret](../security/authentication/) used to create an event source is revoked or the secret’s privileges change, the stream consuming the event source closes due to permission loss. This applies even if the secret still has access to the documents the event source is tracking. ## [](#sets)Supported Sets You can only create an event source on a supported Set. The Set can only contain documents from a user-defined collection. The source’s Set affects the exact behavior of [`set.eventSource()`](../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../reference/fql-api/set/eventson/).

| Supported Set | Behavior |
| --- | --- |
| User-defined collection | `set.eventSource()` emits events for any change to any document in the Set, including the addition or removal of documents. `set.eventsOn()` emits events for any change to specified document fields in the Set. It also emits events for changes that add or remove documents with the specified fields from the Set. You can’t create an event source on a system collection. |
| User-defined index | `set.eventSource()` emits events for changes to the index’s terms or values fields for documents in the Set. It also emits events for changes that add or remove documents from the Set. `set.eventsOn()` emits events for changes to specified terms or values fields in the Set. You can only specify terms or values fields. You can’t create an event source on an index for a system collection. |
| Document | `set.eventSource()` emits events for changes to any field for the document. `set.eventsOn()` emits events for changes to specified fields for the document. |

### [](#coll)Collection event sources Calling [`set.eventSource()`](../../reference/fql-api/set/eventsource/) directly on [`collection.all()`](../../reference/fql-api/collection/instance-all/) tracks any change to any document in the collection. The following query tracks any change to documents in the `Product` collection: ```fql Product.all().eventSource() ``` For example, if you change a `Product` document’s `price` to below `100_00`, the event source emits an `update` event. You can use [`collection.where()`](../../reference/fql-api/collection/instance-where/) to filter the tracked documents for a collection. For example, the following query only tracks `Product` documents with a `price` of less than `100_00`. ```fql Product.where(.price < 100_00).eventSource() ``` If you change a `Product` document’s `price` from above `100_00` to below `100_00`, the event source emits an `add` event. Before the change, the document would not have been part of the event source’s Set. You can use [`set.eventsOn()`](../../reference/fql-api/set/eventson/) to only track changes to specific fields. The following query tracks changes made to any `Product` document’s `description`. The event source doesn’t emit events for changes to other fields. 
```fql Product.all().eventsOn(.description) ``` ### [](#index)Index event sources Index event sources only emit events for changes to the index’s `terms` or `values` fields. For example, the following `Product` collection’s `byCategory()` index has: * A _term_ field of `category` * _Value_ fields of `name` and `price` ```fsl collection Product { *: Any index byCategory { terms [.category] values [.name, .price] } ... } ``` The following query only tracks changes to the `category`, `name`, or `price` fields for `Product` documents with a `category` of `produce`. ```fql let produce = Category.byName("produce").first() Product.byCategory(produce).eventSource() ``` When called on an index, [`set.eventsOn()`](../../reference/fql-api/set/eventson/) only accepts the index’s `terms` or `values` fields as arguments. For example, in the following query, [`set.eventsOn()`](../../reference/fql-api/set/eventson/) only accepts `.category`, `.name`, or `.price` as arguments. ```fql let produce = Category.byName("produce").first() Product.byCategory(produce).eventsOn(.category, .name) ``` ### [](#doc)Document event sources You can use event sources to track changes to a Set containing a single document. These event sources only emit events when the document changes. Use [`Set.single()`](../../reference/fql-api/set/static-single/) to create a Set from a document. ```fql let product = Product.byId(111)! Set.single(product).eventSource() ``` Use [`set.eventsOn()`](../../reference/fql-api/set/eventson/) to only track changes to specific fields of the document. ```fql let product = Product.byId(111)! Set.single(product).eventsOn(.name, .price) ``` ### [](#resource-deletion)Resource deletion If the database or source for an event source is deleted, the event source won’t emit any further events. [event streams](#event-streaming) for the event source don’t automatically close. ## [](#transformations-filters)Supported transformations and filters Event sources only support [source Sets](#sets) that are transformed or filtered using: * [`set.where()`](../../reference/fql-api/set/where/) * [`set.map()`](../../reference/fql-api/set/map/) * [Projection](../../reference/fql/projection/) This ensures Fauna can convert the Set to an event source. Sets using unsupported transformations or filters will fail to convert. For example, the Set for the following event source uses the unsupported [`set.drop()`](../../reference/fql-api/set/drop/) method. ```fql Product.all().drop(10).eventSource() ``` Running the query returns the following error: ``` invalid_receiver: can't call `.eventSource()` because streaming is not supported on sets returned from `.drop()`. error: can't call `.eventSource()` because streaming is not supported on sets returned from `.drop()`. at *query*:1:35 | 1 | Product.all().drop(10).eventSource() | ^^ | ``` ### [](#filter)Filters Use [`set.where()`](../../reference/fql-api/set/where/) to filter an event source’s Set. For example, the following query only tracks changes to `Product` documents with: * A `category` of `produce` * A `price` less than `100_00` ```fql let produce = Category.byName("produce").first() Product .all() .where(.category == produce) .where(.price < 100_00) .eventSource() ``` You can also call [`set.where()`](../../reference/fql-api/set/where/) directly on [`set.eventSource()`](../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../reference/fql-api/set/eventson/). The following query is equivalent to the previous one. 
```fql let produce = Category.byName("produce").first() Product .all() .eventSource() .where(.category == produce) .where(.price < 100_00) ``` [`set.where()`](../../reference/fql-api/set/where/) produces a new Set based on its criteria. The criteria affect the event types emitted for changes: * Creating a document in the Set produces an `add` event. * Updating a document so that it moves into the Set produces an `add` event. * Updating a document so that it remains in the Set produces an `update` event. * Updating a document so that it moves out of the Set produces a `remove` event. * Deleting a document from the Set produces a `remove` event. * Any other changes produce no events. While filters affect the events emitted for an event source, they don’t reduce event processing, which still impacts performance and cost. See [How filters affect costs and performance](#how-filters-affect-costs-and-performance). ### [](#project)Projection An event source’s `add` and `update` event types include a `data` field. This field contains the document that triggered the event. Use [`set.map()`](../../reference/fql-api/set/map/) or [projection](../../reference/fql/projection/) to return only specific document fields in these events. For example, the following query tracks changes to any field in any `Product` document. The query uses [`set.map()`](../../reference/fql-api/set/map/) to only include the `name` and `price` document fields in the `data` field of `add` and `update` events. ```fql Product .all() .map(product => { name: product.name, price: product.price }) .eventSource() ``` The following query uses [projection](../../reference/fql/projection/) and is equivalent to the previous one. ```fql let products = Product.all() { name, price } products.eventSource() ``` The previous queries can produce the following `add` event. The event’s `data` field includes only the `name` and `price` document fields. ```json { "type": "add", "data": { "name": "pizza", "price": "1599" }, "txn_ts": 1711028312060000, "cursor": "gsGghu789", "stats": { "read_ops": 1, "storage_bytes_read": 69, "compute_ops": 1, "processing_time_ms": 0, "rate_limits_hit": [] } } ``` ## [](#events)Events Event sources emit one event per document per transaction. ### [](#event-order)Event order Events are ordered by ascending `txn_ts` (transaction timestamp). Events from the same transaction share the same `txn_ts`, but their order may differ in event streams across clients. Event feeds return events in the same order across clients. ### [](#event-types)Event types The following table outlines supported event types.

| Event type | Sent when… |
| --- | --- |
| add | A document is added to the Set. |
| remove | A document is removed from the Set. Event sources don’t emit `remove` events for documents deleted due to an expired TTL. Such documents are deleted lazily upon expiration. |
| update | A document in the Set changes. |
| status | An event stream starts or reconnects. Streams also periodically emit `status` events to keep the client connection open and to send stats on operations consumed by event processing, including discarded events that aren’t sent. See the `status` event schema. Event feeds don’t receive or include `status` events. |
| error | An event source can no longer be consumed due to an error. See the `error` event schema and Error codes. |
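As a rough illustration, the following sketch shows one way a consumer might branch on an event’s `type` field, including the `status` and `error` types. It assumes a plain event object shaped like the schema described in the next section; the dispatch function is illustrative and not part of any Fauna driver API.

```javascript
// Minimal sketch: route an event based on its `type` field.
// Assumes `event` follows the event schema described below.
function dispatchEvent(event) {
  switch (event.type) {
    case "add":
    case "update":
    case "remove":
      // Document events carry the triggering document in `data`.
      console.log(`${event.type} event:`, event.data);
      break;
    case "status":
      // Keep-alive event; includes stats but no document data.
      console.log("status event stats:", event.stats);
      break;
    case "error":
      // The event source can no longer be consumed.
      console.error(`${event.error.code}: ${event.error.message}`);
      break;
  }
}
```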
### [](#event-schema)Event schema Events with a type other than `status` or `error` have the following schema: ```json { "type": "add", "data": { "@doc": { "id": "392914348360597540", "coll": { "@mod": "Product" }, "ts": { "@time": "2099-03-20T21:46:12.580Z" }, "foo": "bar" } }, "txn_ts": 1710968002310000, "cursor": "gsGabc123", "stats": { "read_ops": 8, "storage_bytes_read": 208, "compute_ops": 1, "processing_time_ms": 0, "rate_limits_hit": [] } } ``` `status` event types have the following schema: ```json { "type": "status", "txn_ts": 1710968002310000, "cursor": "gsGabc123", "stats": { "read_ops": 0, "storage_bytes_read": 0, "compute_ops": 0, "processing_time_ms": 0, "rate_limits_hit": [] } } ``` `error` event types have the following schema: ```json { "type": "error", "error": { "code": "invalid_stream_start_time", "message": "Stream start time 2099-09-05T14:27:10.100Z is too far in the past. Recreate the stream and try again." }, "stats": { "read_ops": 0, "storage_bytes_read": 0, "compute_ops": 0, "processing_time_ms": 0, "rate_limits_hit": [] } } ```

| Field name | Type | Description |
| --- | --- | --- |
| type | string | Event type: `add`, `remove`, `update`, `status`, or `error`. Event feeds don’t receive or include `status` events. |
| data | object | Document that triggered the event. FQL values are encoded using the tagged format. The `status` and `error` event types don’t include this property. |
| error | object | Contains an error for the event source. Only the `error` event type includes this property. See the `error` fields below. |
| txn_ts | integer | The related transaction’s commit time in microseconds since the Unix epoch. The `error` event type doesn’t include this property. |
| cursor | string | Cursor for the event. The Fauna HTTP API and client drivers can use the cursor to replay events that occurred after the cursor. See Restart from an event cursor. The `error` event type doesn’t include this property. |
| stats | object | Event statistics. See the `stats` fields below. |

The `error` object has the following fields:

| Field name | Type | Description |
| --- | --- | --- |
| code | string | Code for the error. Error codes are part of the API contract and are safe to write programmatic logic against. See Error codes for a list of possible error codes. |
| message | string | Human-readable description of the error. |

The `stats` object has the following fields:

| Field name | Type | Description |
| --- | --- | --- |
| read_ops | integer | Transactional Read Operations (TROs) consumed by the event. |
| storage_bytes_read | integer | Amount of data read from storage, in bytes. |
| compute_ops | integer | Transactional Compute Operations (TCOs) consumed by the event. |
| processing_time_ms | integer | Event processing time in milliseconds. |
| rate_limits_hit | array | Operations that exceeded their rate limit. |

### [](#error-codes)Error codes The following table outlines possible error codes for `error` events.

| Error code | Cause |
| --- | --- |
| internal_error | An internal error caused by Fauna. |
| invalid_stream_start_time | The requested cursor or start time is too far in the past. The collection containing the stream’s document Set doesn’t retain enough history to replay requested events. |
| permission_loss | The authentication secret used to create the event source was revoked or the secret’s privileges changed. See Permission changes. |
| stream_overflow | The event source attempts to process more than 128 events at once, exceeding the event limit. |
| stream_replay_volume_exceeded | The event source would replay more than 128 events at once, exceeding the event limit. |

## [](#cost-performance)Costs and performance An event source’s cost and performance are closely related to its _shape_. An event source’s shape is defined by: * The [source Set](#sets) * [Transformations and filters](#transformations-filters) applied to the source Set Processing and sending events consume Transactional Read Operations (TROs) and Transactional Compute Operations (TCOs). The exact number of TROs and TCOs consumed varies based on the event source’s shape. Depending on its cardinality and throughput, consuming an event source for a large Set may cause delays in event delivery and consume more operations. If an event source replays events, it may also consume additional operations. Each event includes [stats](#stats) that report consumed operations. If you exceed your account’s or your plan’s operations limit, the event source emits an `error` event. For event streams, this closes the stream. ### [](#how-filters-affect-costs-and-performance)How filters affect costs and performance Event sources may discard events based on [filters](#transformations-filters). For example, an event source with the following query uses a filter to only emit events for `Product` documents with a `category` of `produce`: ```fql let produce = Category.byName("produce").first() Product .all() .where(.category == produce) .eventSource() ``` To do this, Fauna processes an event for any change to any `Product` document. It then discards events for documents without a `category` of `produce`. These discarded events still consume operations for your account. To track changes on a large Set, we recommend using an [index event source](#index). For example, the following event source emits events similar to the previous one. However, it only tracks the index’s `terms` and `values` fields: ```fql let produce = Category.byName("produce").first() Product .byCategory(produce) .eventSource() ``` Another source of discarded events is privilege predicates in roles. 
For example, the following role uses predicates to grant its members read and write access only to `Product` documents with a `category` of `produce`: ```fsl role ProduceManager { privileges Product { write { predicate ((product, _) => product?.category?.name == "produce") } read { predicate (product => product?.category?.name == "produce") } } } ``` An event source created using an [authentication secret](../security/authentication/) with this role only emits events for documents the role can access. Other events are discarded. These discarded events still consume operations for your account. ## [](#limits)Limitations * [Operation limits](../../reference/requirements-limits/) apply to event sources. * While processing events, Fauna runs one query per transaction. * An event source can’t replay or process more than 128 events at a time. If an event source has more than 128 events to process, Fauna closes the event source with an error event. * You can’t create event sources for: * A [system collection](../data-model/collections/#system-coll) * An [index](../data-model/indexes/) for a [system collection](../data-model/collections/#system-coll). * A [set](../../reference/fql/types/#set) that combines documents from one or more collections. # Best practices for event feeds and event streams This guide covers best practices for creating and consuming [event sources](../) using [event feeds](../#event-feeds) or [event streams](../#event-streaming). ## [](#configure-history_days)Configure `history_days` Event feeds and event streams replay events from a collection’s [document history](../../doc-history/). The number of days of history to retain is defined by the [collection schema](../../schema/#collection-schema)'s [`history_days`](../../doc-history/#history-retention) setting: ```fsl collection Customer { ... // Number of days of document history to retain. history_days 3 } ``` You can’t replay events outside this period. If `history_days` is `0` or unset, the period is limited to 15 minutes. Increasing `history_days` can also [impact read ops, storage, latency, and indexing](../../doc-history/#impacts). ## [](#follow-query-best-practices)Follow query best practices When composing FQL queries for event sources, follow [best practices for queries](../../query/best-practices/). ## [](#manage-the-event-sources-set-and-shape)Manage the event source’s Set and shape An event source’s cost and performance are closely related to its [Set](../#sets) and [shape](../#transformations-filters). For more details, see [Costs and performance](../#cost-performance) in the event feeds and event streams reference docs. ## [](#use-projection-to-limit-event-fields)Use projection to limit event fields [Projection](../../../reference/fql/projection/) lets you select the fields to include in an event source’s events. For the best performance and costs, use an [index](../../data-model/indexes/) and only project fields covered as an index term or value. This lets the event source read data from the index rather than the underlying documents. | See Projecting only the data you need into the event source | | --- | --- | --- | ## [](#use-index-terms-to-filter-tracked-sets)Use index terms to filter tracked Sets For the best performance, especially on large datasets, use an index with [terms](../../data-model/indexes/#terms) to filter collection documents based on an exact field value. 
For example, instead of using [`collection.where()`](../../../reference/fql-api/collection/instance-where/) to perform an equality comparison: ```fql let produce = Category.byName("produce").first() let products = Product .where(.category == produce && .price < 100_00) { category, name, description, price } products.eventSource() ``` You can create an index definition with `category` as an index term and `price` as an index value: ```fsl collection Product { ... index byCategory { terms [.category] values [.price, .name, .description] } } ``` Then update the query to replace the equality comparison with an index call: ```fql let produce = Category.byName("produce").first() let products = Product .byCategory(produce) .where( .price < 100_00) { category, name, description, price } products.eventSource() ``` | See Index event sources | | --- | --- | --- | ## [](#use-a-key-with-a-built-in-role)Use a key with a built-in role If you don’t need identity-based authentication, use a key with a built-in role to authenticate event feed and event stream requests. An event source only emits events for documents the authentication secret can access. Event feed and event stream requests that use a key with the same built-in role can access the same documents and always emit the same events. Internally, this lets Fauna unify event processing for these event sources. Event sources that use a token or a JWT with user-defined roles may discard different events based on [attribute-based access control (ABAC)](../../security/abac/). To account for this, Fauna must process events for these event sources individually. | See Costs and performance | | --- | --- | --- | ## [](#use-page-cursors-when-using-event-feeds-plan-to-track-filtered-sets)Use page cursors when using event feeds to track filtered Sets When polling an event source using an event feed, use the feed’s top-level page `cursor` to periodically get events for a [filtered Set](../#filter). The cursor tracks any events that were processed but not emitted for the event source. For strict filters, this can be a large number of events. Using the cursor can prevent excessive event replays and [event limit](../#transaction-limit) errors. # Fauna v10 (current) client drivers You can connect your application to Fauna using one of Fauna’s official client drivers. ![JavaScript](../_images/drivers/logos/javascript.svg) [JavaScript](js-client/) ![Python](../_images/drivers/logos/python.svg) [Python](py-client/) ![Go](../_images/drivers/logos/golang.svg) [Go](go-client/) ![C#](../_images/drivers/logos/csharp.svg) [.NET/C#](dotnet-client/) ![JVM](../_images/drivers/logos/java.svg) [Java](jvm-client/) Each driver is an open-source wrapper for the [Fauna Core HTTP API](../../reference/http/reference/core-api/). They’re lightweight enough for use in serverless functions, edge networks, IoT devices, and other resource-constrained environments. ## [](#driver-versioning)Driver versioning The Fauna drivers use [semantic versioning](https://semver.org/). Each driver is versioned independently. The above drivers can only be used with FQL v10. # Fauna v10 JavaScript client driver (current) | Version: 2.5.0 | Repository: fauna/fauna-js | | --- | --- | --- | --- | Fauna’s JavaScript client driver lets you run FQL queries from JavaScript or TypeScript applications. This guide shows how to set up the driver and use it to run FQL queries. ## [](#supported-runtimes)Supported runtimes The driver supports the following runtime environments. 
### [](#server-side)Server-side Node.js - [Current and active LTS versions](https://nodejs.org/en/about/releases/): * Current - v20 * LTS - v18 ### [](#cloud-providers)Cloud providers * Cloudflare Workers * AWS Lambda (See [AWS Lambda connections](#aws-lambda-connections)) * Netlify * Vercel ### [](#browsers)Browsers Stable versions of: * Chrome 69+ * Firefox 62+ * Safari 12.1+ * Edge 79+ ## [](#installation)Installation The driver is available on [npm](https://www.npmjs.com/package/fauna). Install it using your preferred package manager: ```bash npm install fauna ``` Browsers can import the driver using a CDN link: ```html ``` ## [](#api-reference)API reference API reference documentation for the driver is available at [https://fauna.github.io/fauna-js/](https://fauna.github.io/fauna-js/). ## [](#sample-app)Sample app For a practical example, check out the [JavaScript sample app](https://github.com/fauna/js-sample-app). This sample app is an e-commerce application that uses Node.js and the Fauna JavaScript driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries. ## [](#basic-usage)Basic usage The following application: * Initializes a client instance to connect to Fauna * Composes a basic FQL query using an `fql` string template * Runs the query using `query()` ```javascript import { Client, fql, FaunaError } from "fauna"; // Use `require` for CommonJS: // const { Client, fql, FaunaError } = require('fauna'); // Initialize the client to connect to Fauna const client = new Client({ secret: 'FAUNA_SECRET' }); try { // Compose a query const query = fql` Product.sortedByPriceLowToHigh() { name, description, price }`; // Run the query const response = await client.query(query); console.log(response.data); } catch (error) { if (error instanceof FaunaError) { console.log(error); } } finally { // Clean up any remaining resources client.close(); } ``` ## [](#connect-to-fauna)Connect to Fauna Each Fauna query is an independently authenticated request to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). You authenticate with Fauna using an [authentication secret](../../../learn/security/authentication/#secrets). ### [](#get-an-authentication-secret)Get an authentication secret Fauna supports several [secret types](../../../learn/security/authentication/#secret-types). For testing, you can create a [key](../../../learn/security/keys/), which is a type of secret: 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/). 2. On the **Explorer** page, create a database. 3. In the database’s **Keys** tab, click **Create Key**. 4. Choose a **Role** of **server**. 5. Click **Save**. 6. Copy the **Key Secret**. The secret is scoped to the database. ### [](#initialize-a-client)Initialize a client To send query requests to Fauna, initialize a `Client` instance using a Fauna authentication secret: ```javascript const client = new Client({ secret: 'FAUNA_SECRET' }); ``` If not specified, `secret` defaults to the `FAUNA_SECRET` environment variable. For other configuration options, see [Client configuration](#config). ### [](#connect-to-a-child-database)Connect to a child database A [scoped key](../../../learn/security/keys/#scoped-keys) lets you use a parent database’s admin key to send query requests to its child databases. 
For example, if you have an admin key for a parent database and want to connect to a child database named `childDB`, you can create a scoped key using the following format: ``` // Scoped key that impersonates an `admin` key for // the `childDB` child database. fn...:childDB:admin ``` You can then initialize a `Client` instance using the scoped key: ```javascript const client = new Client({ secret: 'fn...:childDB:admin' }); ``` ### [](#multiple-connections)Multiple connections You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies. You can create multiple client instances to connect to Fauna using different credentials or client configurations. ### [](#aws-lambda-connections)AWS Lambda connections AWS Lambda freezes, thaws, and reuses execution environments for Lambda functions. See [Lambda execution environment](https://docs.aws.amazon.com/lambda/latest/dg/running-lambda-code.html). When an execution environment is thawed, Lambda only runs the function’s handler code. Objects declared outside of the handler method remain initialized from before the freeze. Lambda doesn’t re-run initialization code outside the handler. Fauna drivers keep socket connections that can time out during long freezes, causing `ECONNRESET` errors when thawed. To prevent timeouts, create Fauna client connections inside function handlers. Fauna drivers use lightweight HTTP connections. You can create new connections for each request while maintaining good performance. ## [](#run-fql-queries)Run FQL queries Use `fql` string templates to compose FQL queries. Run the queries using `query()`: ```javascript const query = fql`Product.sortedByPriceLowToHigh()`; client.query(query) ``` By default, `query()` uses query options from the [Client configuration](#config). You can pass options to `query()` to override these defaults. See [Query options](#query-opts). You can only compose FQL queries using string templates. ### [](#var)Variable interpolation Use `${}` to pass native JavaScript variables to `fql` queries: ```javascript // Create a native JS var const collectionName = "Product"; // Pass the var to an FQL query const query = fql` let collection = Collection(${collectionName}) collection.sortedByPriceLowToHigh()`; client.query(query); ``` The driver encodes interpolated variables to an appropriate [FQL type](../../../reference/fql/types/) and uses the [wire protocol](../../../reference/http/reference/wire-protocol/) to pass the query to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). This helps prevent injection attacks. ### [](#query-composition)Query composition You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query: ```javascript // Create a reusable query fragment. const product = fql`Product.byName("pizza").first()`; // Use the fragment in another FQL query. const query = fql` let product = ${product} product { name, price }`; client.query(query); ``` ### [](#pagination)Pagination Use `paginate()` to iterate through a Set that contains more than one page of results. `paginate()` accepts the same [Query options](#query-opts) as `query()`. ```javascript // Adjust `pageSize()` size as needed. 
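// `pageSize()` sets how many documents are returned per page;
// `paginate()` then requests each page in turn as you iterate.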
const query = fql` Product.sortedByPriceLowToHigh() .pageSize(2)`; const pages = client.paginate(query); for await (const products of pages) { for (const product of products) { console.log(product) // ... } } ``` Use `flatten()` to get paginated results as a single, flat array: ```javascript const pages = client.paginate(query); for await (const product of pages.flatten()) { console.log(product) } ``` ### [](#query-stats)Query stats Successful query responses and `ServiceError` errors include [query stats](../../../reference/http/reference/query-stats/): ```javascript try { const response = await client.query(fql`"Hello world"`); console.log(response.stats); } catch (error) { if (error instanceof ServiceError) { const info = error.queryInfo; const stats = info.stats; } } ``` Output: ```json { compute_ops: 1, read_ops: 0, write_ops: 0, query_time_ms: 0, contention_retries: 0, storage_bytes_read: 0, storage_bytes_write: 0, rate_limits_hit: [], attempts: 1 } ``` ## [](#typescript-support)TypeScript support The driver supports TypeScript. For example, you can apply a type parameter to your FQL query results: ```typescript import { fql, Client, type QuerySuccess } from "fauna"; const client = new Client({ secret: 'FAUNA_SECRET' }); type Customer = { name: string; email: string; }; const query = fql`{ name: "Alice Appleseed", email: "alice.appleseed@example.com", }`; const response: QuerySuccess<Customer> = await client.query<Customer>(query); const customer_doc: Customer = response.data; console.assert(customer_doc.name === "Alice Appleseed"); console.assert(customer_doc.email === "alice.appleseed@example.com"); ``` Alternatively, you can apply a type parameter directly to your `fql` statements and `Client` methods will infer your return types. Due to backwards compatibility, if a type parameter is provided to a `Client` method, the provided type will override the inferred type from your query. ```typescript type User = { name: string; email: string; }; const query = fql<User>`{ name: "Alice", email: "alice@site.example", }`; // Response will be typed as `QuerySuccess<User>`. const response = await client.query(query); // `userDoc` will be automatically inferred as `User`. const userDoc = response.data; console.assert(userDoc.name === "Alice"); console.assert(userDoc.email === "alice@site.example"); client.close(); ``` ## [](#config)Client configuration The `Client` instance comes with reasonable configuration defaults. We recommend using the defaults in most cases. If needed, you can configure the client to override the defaults. This also lets you set default [Query options](#query-opts). ```javascript import { Client, endpoints } from "fauna"; const config = { // Configure the client client_timeout_buffer_ms: 5000, endpoint: endpoints.default, fetch_keepalive: false, http2_max_streams: 100, http2_session_idle_ms: 5000, secret: "FAUNA_SECRET", // Set default query options format: "tagged", linearized: false, long_type: "number", max_attempts: 3, max_backoff: 20, max_contention_retries: 5, query_tags: { tag: "value" }, query_timeout_ms: 60_000, traceparent: "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00", typecheck: true, }; const client = new Client(config); ``` For supported properties, see [ClientConfiguration](https://fauna.github.io/fauna-js/latest/interfaces/ClientConfiguration.html) in the API reference. ### [](#environment-variables)Environment variables By default, `secret` and `endpoint` default to the respective `FAUNA_SECRET` and `FAUNA_ENDPOINT` environment variables. 
For example, if you set the following environment variables: ```bash export FAUNA_SECRET=FAUNA_SECRET export FAUNA_ENDPOINT=https://db.fauna.com/ ``` You can initialize the client with a default configuration: ```javascript const client = new Client(); ``` ### [](#retries)Retries By default, the client automatically retries query requests that return a `limit_exceeded` [error code](../../../reference/http/reference/errors/). Retries use an exponential backoff. Use the [Client configuration](#config)'s `max_backoff` property to set the maximum time between retries. Similarly, use `max_attempts` to set the maximum number of retry attempts, including the initial request. ## [](#query-opts)Query options The [Client configuration](#config) sets default query options for the following methods: * `query()` * `paginate()` You can pass a `QueryOptions` object to override these defaults: ```javascript const options = { arguments: { name: "Alice" }, format: "tagged", linearized: false, long_type: "number", max_contention_retries: 5, query_tags: { tag: "value" }, query_timeout_ms: 60_000, traceparent: "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00", typecheck: true, }; client.query(fql`"Hello, #{name}!"`, options); ``` For supported properties, see [QueryOptions](https://fauna.github.io/fauna-js/latest/interfaces/QueryOptions.html) in the API reference. ## [](#event-feeds)Event feeds The driver supports [event feeds](../../../learn/cdc/#event-feeds). An event feed asynchronously polls an [event source](../../../learn/cdc/#create-an-event-source) for paginated events. To use event feeds, you must have a Pro or Enterprise plan. ### [](#request-an-event-feed)Request an event feed To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets). To get paginated events, pass the event source to `feed()`: ```javascript const response = await client.query(fql` let set = Product.all() { initialPage: set.pageSize(10), eventSource: set.eventSource() } `); const { initialPage, eventSource } = response.data; const feed = client.feed(eventSource); ``` If changes occur between the creation of the event source and the event feed request, the feed replays and emits any related events. You can also pass a query that produces an event source directly to `feed()`: ```javascript const query = fql`Product.all().eventsOn(.price, .stock)`; const feed = client.feed(query); ``` In most cases, you’ll get events after a specific [event cursor](#cursor) or [start time](#start-time). #### [](#start-time)Get events after a specific start time When you first poll an event source using an event feed, you usually include a `start_ts` (start timestamp) in the [`FeedClientConfiguration` object](#event-feed-opts) that’s passed to `feed()`. The request returns events that occurred after the specified timestamp (exclusive). 
`start_ts` is an integer representing a time in microseconds since the Unix epoch: ```typescript // Calculate timestamp for 10 minutes ago const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); // Convert to microseconds const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; const options: FeedClientConfiguration = { start_ts: startTs }; const feed = client.feed(fql`Product.all().eventSource()`, options); ``` #### [](#cursor)Get events after a specific event cursor After the initial request, you usually get subsequent events using the [cursor](../../../learn/cdc/#cursor) for the last page or event. To get events after a cursor (exclusive), include the `cursor` in the [`FeedClientConfiguration` object](#event-feed-opts) that’s passed to `feed()`: ```typescript const options: FeedClientConfiguration = { // Cursor for a previous page cursor: "gsGabc456" }; const feed = client.feed(fql`Product.all().eventSource()`, options); ``` ### [](#loop)Iterate on an event feed `feed()` returns a `FeedClient` instance that acts as an `AsyncIterator`. You can use `for await...of` to iterate through the pages of events: ```javascript const query = fql`Product.all().eventSource()`; // Calculate timestamp for 10 minutes ago const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; const options: FeedClientConfiguration = { start_ts: startTs }; const feed = client.feed(query, options); for await (const page of feed) { console.log("Page stats", page.stats); for (const event of page.events) { switch (event.type) { case "add": // Do something on add console.log("Add event: ", event); break; case "update": // Do something on update console.log("Update event: ", event); break; case "remove": // Do something on remove console.log("Remove event: ", event); break; } } } ``` Alternatively, use `flatten()` to get events as a single, flat array: ```javascript const query = fql`Product.all().eventSource()`; // Calculate timestamp for 10 minutes ago const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; const options = { start_ts: startTs }; const feed = client.feed(query, options); for await (const event of feed.flatten()) { switch (event.type) { case "add": // Do something on add console.log("Add event: ", event); break; case "update": // Do something on update console.log("Update event: ", event); break; case "remove": // Do something on remove console.log("Remove event: ", event); break; } } ``` The event feed iterator will stop when there are no more events to poll. Each page includes a top-level `cursor`. You can include the cursor in a [`FeedClientConfiguration` object](#event-feed-opts) passed to `feed()` to poll for events after the cursor: ```typescript import { Client, fql } from "fauna"; const client = new Client(); async function processFeed(client, query, startTs = null, sleepTime = 300) { let cursor = null; while (true) { // Only include `start_ts `if `cursor` is null. Otherwise, only include `cursor`. const options = cursor === null ? 
{ start_ts: startTs } : { cursor: cursor }; const feed = client.feed(query, options); for await (const page of feed) { for (const event of page.events) { switch (event.type) { case "add": console.log("Add event: ", event); break; case "update": console.log("Update event: ", event); break; case "remove": console.log("Remove event: ", event); break; } } // Store the cursor of the last page cursor = page.cursor; } // Clear startTs after the first request startTs = null; console.log(`Sleeping for ${sleepTime} seconds...`); await new Promise(resolve => setTimeout(resolve, sleepTime * 1000)); } } const query = fql`Product.all().eventsOn(.price, .stock)`; // Calculate timestamp for 10 minutes ago const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; processFeed(client, query, startTs); ``` If needed, you can store the cursor as a collection document: ```typescript import { Client, fql } from "fauna"; const client = new Client(); async function processFeed(client, query, startTs = null, sleepTime = 300) { // Create the `Cursor` collection. await client.query( fql` if (Collection.byName("Cursor").exists() == false) { Collection.create({ name: "Cursor", fields: { name: { signature: "String" }, value: { signature: "String?" } }, constraints: [ { unique: [ { field: ".name", mva: false } ] } ], indexes: { byName: { terms: [ { field: ".name", mva: false } ] } }, }) } else { null } ` ); // Create a `ProductInventory` document in the `Cursor` collection. // The document holds the latest cursor. await client.query( fql` if (Collection("Cursor").byName("ProductInventory").first() == null) { Cursor.create({ name: "ProductInventory", value: null }) } else { null } ` ); while (true) { // Get existing cursor from the `Cursor` collection. const cursorResponse = await client.query( fql`Cursor.byName("ProductInventory").first()` ); let cursor = cursorResponse.data?.value || null; // Only include `start_ts` if `cursor` is null. Otherwise, only include `cursor`. const options = cursor === null ? { start_ts: startTs } : { cursor: cursor }; const feed = client.feed(query, options); for await (const page of feed) { for (const event of page.events) { switch (event.type) { case "add": console.log("Add event: ", event); break; case "update": console.log("Update event: ", event); break; case "remove": console.log("Remove event: ", event); break; } } // Store the cursor of the last page cursor = page.cursor; await client.query( fql` Cursor.byName("ProductInventory").first()!.update({ value: ${cursor} }) ` ); console.log(`Cursor updated: ${cursor}`); } // Clear startTs after the first request startTs = null; console.log(`Sleeping for ${sleepTime} seconds...`); await new Promise(resolve => setTimeout(resolve, sleepTime * 1000)); } } const query = fql`Product.all().eventsOn(.price, .stock)`; // Calculate timestamp for 10 minutes ago const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; processFeed(client, query, startTs).catch(console.error); ``` ### [](#error-handling)Error handling Exceptions can be raised at two different places: * While fetching a page * While iterating a page’s events This distinction allows you to ignore errors originating from event processing. 
For example: ```javascript const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000); const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000; const options = { start_ts: startTs }; const feed = client.feed(fql` Product.all().map(.name.toUpperCase()).eventSource() `, options); try { for await (const page of feed) { // Pages will stop at the first error encountered. // Therefore, it’s safe to handle event failures // and then pull more pages. try { for (const event of page.events) { console.log("Event: ", event); } } catch (error: unknown) { console.log("Feed event error: ", error); } } } catch (error: unknown) { console.log("Non-retryable error: ", error); } ``` Each page’s `cursor` contains the cursor for the page’s last successfully processed event. If you’re using a [loop to poll for changes](#loop), using the cursor will skip any events that caused errors. ### [](#event-feed-opts)Event feed options The client configuration sets the default options for `feed()`. You can pass a `FeedClientConfiguration` object to override these defaults: ```typescript const options: FeedClientConfiguration = { long_type: "number", max_attempts: 5, max_backoff: 1000, query_timeout_ms: 5000, client_timeout_buffer_ms: 5000, secret: "FAUNA_SECRET", cursor: undefined, start_ts: undefined, }; client.feed(fql`Product.all().eventSource()`, options); ``` For supported properties, see [FeedClientConfiguration](https://fauna.github.io/fauna-js/latest/types/FeedClientConfiguration.html) in the API reference. ## [](#event-streaming)Event streams The driver supports [event streams](../../../learn/cdc/). ### [](#start-a-stream)Start a stream To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets). To stream the source’s events, pass the event source to `stream()`: ```javascript const response = await client.query(fql` let set = Product.all() { initialPage: set.pageSize(10), eventSource: set.eventSource() } `); const { initialPage, eventSource } = response.data; client.stream(eventSource) ``` You can also pass a query that produces an event source directly to `stream()`: ```javascript const query = fql`Product.all().eventsOn(.price, .stock)` client.stream(query) ``` ### [](#iterate-on-a-stream)Iterate on a stream You can iterate on the stream using an async loop: ```javascript try { for await (const event of stream) { switch (event.type) { case "update": case "add": case "remove": console.log("Stream event:", event); // ... break; } } } catch (error) { // An error will be handled here if Fauna returns a terminal, "error" event, or // if Fauna returns a non-200 response when trying to connect, or // if the max number of retries on network errors is reached. // ... handle fatal error } ``` Or you can use a callback function: ```javascript stream.start( function onEvent(event) { switch (event.type) { case "update": case "add": case "remove": console.log("Stream event:", event); // ... break; } }, function onFatalError(error) { // An error will be handled here if Fauna returns a terminal, "error" event, or // if Fauna returns a non-200 response when trying to connect, or // if the max number of retries on network errors is reached. // ... 
handle fatal error } ); ``` ### [](#close-a-stream)Close a stream Use `close()` to close a stream: ```javascript const stream = await client.stream(fql`Product.all().eventSource()`) let count = 0; for await (const event of stream) { console.log("Stream event:", event); // ... count++; // Close the stream after 2 events if (count === 2) { stream.close() break; } } ``` ### [](#stream-options)Stream options The [Client configuration](#config) sets default options for the `stream()` method. You can pass a `StreamClientConfiguration` object to override these defaults: ```javascript const options = { long_type: "number", max_attempts: 5, max_backoff: 1000, secret: "FAUNA_SECRET", status_events: true, }; client.stream(fql`Product.all().eventSource()`, options) ``` For supported properties, see [StreamClientConfiguration](https://fauna.github.io/fauna-js/latest/types/StreamClientConfiguration.html) in the API reference. ### [](#sample-app-2)Sample app For a practical example that uses the JavaScript driver with event streams, check out the [event streaming sample app](../../sample-apps/streaming/). ## [](#debug-logging)Debug logging To enable or disable debug logging, set the `FAUNA_DEBUG` environment variable to a string-encoded [`LOG_LEVELS`](https://fauna.github.io/fauna-js/latest/variables/LOG_LEVELS.html) integer: ```shell # Enable logging for warnings (3) and above: export FAUNA_DEBUG="3" ``` Logs are output to `console` methods. If `FAUNA_DEBUG` is not set or is invalid, logging is disabled. For advanced logging, you can pass a custom log handler using the [client configuration](#config)'s `logger` property: ```js import { Client, LOG_LEVELS } from "fauna"; import { CustomLogHandler } from "./your-logging-module"; // Create a client with a custom logger. const client = new Client({ logger: new CustomLogHandler(LOG_LEVELS.DEBUG), }); ``` # Fauna v10 Python client driver (current) | Version: 2.4.0 | Repository: fauna/fauna-python | | --- | --- | --- | --- | Fauna’s Python client driver lets you run FQL queries from Python applications. This guide shows how to set up the driver and use it to run FQL queries. ## [](#supported-python-versions)Supported Python versions * Python 3.9 * Python 3.10 * Python 3.11 * Python 3.12 ## [](#supported-cloud-runtimes)Supported cloud runtimes * AWS Lambda (See [AWS Lambda connections](#aws-lambda-connections)) * Vercel Functions ## [](#installation)Installation The driver is available on [PyPI](https://pypi.org/project/fauna/). To install it, run: ```bash pip install fauna ``` ## [](#api-reference)API reference API reference documentation for the driver is available at [https://fauna.github.io/fauna-python/](https://fauna.github.io/fauna-python/). ## [](#sample-app)Sample app For a practical example, check out the [Python sample app](https://github.com/fauna/python-sample-app). This sample app is an e-commerce application that uses Python3, Flask, and the Fauna Python driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries. 
## [](#basic-usage)Basic usage The following application: * Initializes a client instance to connect to Fauna * Composes a basic FQL query using an `fql` string template * Runs the query using `query()` ```python from fauna import fql from fauna.client import Client from fauna.encoding import QuerySuccess from fauna.errors import FaunaException # Initialize the client to connect to Fauna client = Client(secret='FAUNA_SECRET') try: # Compose a query query = fql( """ Product.sortedByPriceLowToHigh() { name, description, price }""" ) # Run the query res: QuerySuccess = client.query(query) print(res.data) except FaunaException as e: print(e) finally: # Clean up any remaining resources client.close() ``` ## [](#connect-to-fauna)Connect to Fauna Each Fauna query is an independently authenticated request to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). You authenticate with Fauna using an [authentication secret](../../../learn/security/authentication/#secrets). ### [](#get-an-authentication-secret)Get an authentication secret Fauna supports several [secret types](../../../learn/security/authentication/#secret-types). For testing, you can create a [key](../../../learn/security/keys/), which is a type of secret: 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/). 2. On the **Explorer** page, create a database. 3. In the database’s **Keys** tab, click **Create Key**. 4. Choose a **Role** of **server**. 5. Click **Save**. 6. Copy the **Key Secret**. The secret is scoped to the database. ### [](#initialize-a-client)Initialize a client To send query requests to Fauna, initialize a `Client` instance using a Fauna authentication secret: ```python client = Client(secret='FAUNA_SECRET') ``` If not specified, `secret` defaults to the `FAUNA_SECRET` environment variable. For other configuration options, see [Client configuration](#config). ### [](#connect-to-a-child-database)Connect to a child database A [scoped key](../../../learn/security/keys/#scoped-keys) lets you use a parent database’s admin key to send query requests to its child databases. For example, if you have an admin key for a parent database and want to connect to a child database named `childDB`, you can create a scoped key using the following format: ``` // Scoped key that impersonates an `admin` key for // the `childDB` child database. fn...:childDB:admin ``` You can then initialize a `Client` instance using the scoped key: ```python client = Client(secret='fn...:childDB:admin') ``` ### [](#multiple-connections)Multiple connections You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies. You can create multiple client instances to connect to Fauna using different credentials or client configurations. ### [](#aws-lambda-connections)AWS Lambda connections AWS Lambda freezes, thaws, and reuses execution environments for Lambda functions. See [Lambda execution environment](https://docs.aws.amazon.com/lambda/latest/dg/running-lambda-code.html). When an execution environment is thawed, Lambda only runs the function’s handler code. Objects declared outside of the handler method remain initialized from before the freeze. Lambda doesn’t re-run initialization code outside the handler. Fauna drivers keep socket connections that can time out during long freezes, causing `ECONNRESET` errors when thawed. 
To prevent timeouts, create Fauna client connections inside function handlers. Fauna drivers use lightweight HTTP connections. You can create new connections for each request while maintaining good performance. ## [](#run-fql-queries)Run FQL queries Use `fql` string templates to compose FQL queries. Run the queries using `query()`: ```python query = fql("Product.sortedByPriceLowToHigh()") client.query(query) ``` By default, `query()` uses query options from the [Client configuration](#config). You can pass options to `query()` to override these defaults. See [Query options](#query-opts). You can only compose FQL queries using string templates. ### [](#var)Variable interpolation The driver supports queries with Python primitives, lists, and dicts. Use `${}` to pass native Python variables to `fql` queries as kwargs. You can escape a variable by prepending an additional `$`. ```python # Create a native Python var collection_name = 'Product' # Pass the var to an FQL query query = fql(''' let collection = Collection(${collection_name}) collection.sortedByPriceLowToHigh()''', collection_name=collection_name) client.query(query) ``` The driver encodes interpolated variables to an appropriate [FQL type](../../../reference/fql/types/) and uses the [wire protocol](../../../reference/http/reference/wire-protocol/) to pass the query to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). This helps prevent injection attacks. ### [](#query-composition)Query composition You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query: ```python # Create a reusable query fragment. product = fql('Product.byName("pizza").first()') # Use the fragment in another FQL query. query = fql(f''' let product = {product} product {{ name, price }} ''') client.query(query) ``` ### [](#pagination)Pagination Use `paginate()` to iterate through a Set that contains more than one page of results. `paginate()` accepts the same [Query options](#query-opts) as `query()`. ```python # Adjust `pageSize()` size as needed. query = fql(''' Product.sortedByPriceLowToHigh() .pageSize(2)''') pages = client.paginate(query) for products in pages: for product in products: print(product) ``` ### [](#query-stats)Query stats Successful query responses and `ServiceError` errors return [query stats](../../../reference/http/reference/query-stats/): ```python from fauna import fql from fauna.client import Client from fauna.errors import ServiceError client = Client(secret='FAUNA_SECRET') try: query = fql('"Hello world"') res = client.query(query) print(res.stats) except ServiceError as e: if e.stats is not None: print(e.stats) # more error handling... ``` ### [](#user-defined-classes)User-defined classes Serialization and deserialization with user-defined classes is not supported. When composing FQL queries, adapt your classes into dicts or lists. When instantiating classes from a query result, build them from the expected result. ```python class MyClass: def __init__(self, my_prop): self.my_prop = my_prop def to_dict(self): return { 'my_prop': self.my_prop } @staticmethod def from_result(obj): return MyClass(obj['my_prop']) ``` ## [](#config)Client configuration The `Client` instance comes with reasonable configuration defaults. We recommend using the defaults in most cases. If needed, you can configure the client to override the defaults. This also lets you set default [Query options](#query-opts). 
```python
from datetime import timedelta

from fauna.client import Client
from fauna.client.headers import Header
from fauna.client.endpoints import Endpoints

config = {
    # Configure the client
    'secret': 'FAUNA_SECRET',
    'endpoint': Endpoints.Default,
    'client_buffer_timeout': timedelta(seconds=5),
    'http_read_timeout': None,
    'http_write_timeout': timedelta(seconds=5),
    'http_connect_timeout': timedelta(seconds=5),
    'http_pool_timeout': timedelta(seconds=5),
    'http_idle_timeout': timedelta(seconds=5),
    'max_attempts': 3,
    'max_backoff': 20,
    # Set default query options
    'additional_headers': {'foo': 'bar'},
    'linearized': False,
    'max_contention_retries': 5,
    'query_tags': {'tag': 'value'},
    'query_timeout': timedelta(seconds=60),
    'typecheck': True,
}

client = Client(**config)
```

For supported parameters, see [Client](https://fauna.github.io/fauna-python/latest/api/fauna/client/client.html#Client) in the API reference.

### [](#environment-variables)Environment variables

The `secret` and `endpoint` configuration values default to the `FAUNA_SECRET` and `FAUNA_ENDPOINT` environment variables, respectively. For example, if you set the following environment variables:

```bash
export FAUNA_SECRET=FAUNA_SECRET
export FAUNA_ENDPOINT=https://db.fauna.com/
```

You can initialize the client with a default configuration:

```python
client = Client()
```

### [](#retries)Retries

By default, the client automatically retries query requests that return a `limit_exceeded` [error code](../../../reference/http/reference/errors/). Retries use an exponential backoff.

Use the [Client configuration](#config)'s `max_backoff` parameter to set the maximum time between retries. Similarly, use `max_attempts` to set the maximum number of retry attempts, including the initial request.

## [](#query-opts)Query options

The [Client configuration](#config) sets default query options for the following methods:

* `query()`
* `paginate()`

You can pass a `QueryOptions` object to override these defaults:

```python
options = QueryOptions(
    additional_headers={'foo': 'bar'},
    linearized=False,
    max_contention_retries=5,
    query_tags={'name': 'hello world query'},
    query_timeout=timedelta(seconds=60),
    traceparent='00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00',
    typecheck=True
)

client.query(fql('"Hello world"'), options)
```

For supported properties, see [QueryOptions](https://fauna.github.io/fauna-python/latest/api/fauna/client/client.html#QueryOptions) in the API reference.

## [](#event-feeds)Event feeds

The driver supports [event feeds](../../../learn/cdc/#event-feeds). An event feed asynchronously polls an [event source](../../../learn/cdc/) for events.

To use event feeds, you must have a Pro or Enterprise plan.

### [](#request-an-event-feed)Request an event feed

To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets).

To get paginated events, pass the event source to `feed()`:

```python
from fauna import fql
from fauna.client import Client

client = Client()

response = client.query(fql('''
  let set = Product.all()
  {
    initialPage: set.pageSize(10),
    eventSource: set.eventSource()
  }
  '''))

initial_page = response.data['initialPage']
event_source = response.data['eventSource']

feed = client.feed(event_source)
```

If changes occur between the creation of the event source and the `feed()` request, the feed replays and emits any related events.
You can also pass a query that produces an event source directly to `feed()`: ```python query = fql('Product.all().eventsOn(.price, .stock)') feed = client.feed(query) ``` In most cases, you’ll get events after a specific [start time](#start-time) or [cursor](#cursor). #### [](#start-time)Get events after a specific start time When you first poll an event source using an event feed, you usually include a `start_ts` (start timestamp) in the [`FeedOptions` object](#event-feed-opts) that’s passed to `feed()`. The request returns events that occurred after the specified timestamp (exclusive). `start_ts` is an integer representing a time in microseconds since the Unix epoch: ```python from fauna import fql from fauna.client import Client, FeedOptions from datetime import datetime, timedelta client = Client() # Calculate timestamp for 10 minutes ago ten_minutes_ago = datetime.now() - timedelta(minutes=10) # Convert to microseconds start_ts = int(ten_minutes_ago.timestamp() * 1_000_000) options = FeedOptions( start_ts=start_ts ) feed = client.feed(fql('Product.all().eventSource()'), options) ``` #### [](#cursor)Get events after a specific cursor After the initial request, you usually get subsequent events using the [cursor](../../../learn/cdc/#cursor) for the last page or event. To get events after a cursor (exclusive), include the `cursor` in the [`FeedOptions` object](#event-feed-opts) that’s passed to `feed()`: ```python from fauna import fql from fauna.client import Client, FeedOptions from datetime import datetime, timedelta client = Client() options = FeedOptions( # Cursor for a previous page cursor='gsGabc456' ) feed = client.feed(fql('Product.all().eventSource()'), options) ``` ### [](#loop)Iterate on an event feed `feed()` returns an iterator that emits pages of events. You can use a for loop to iterate through the pages: ```python query = fql('Product.all().eventsOn(.price, .stock)') # Calculate timestamp for 10 minutes ago ten_minutes_ago = datetime.now() - timedelta(minutes=10) start_ts = int(ten_minutes_ago.timestamp() * 1_000_000) options = FeedOptions( start_ts=start_ts ) feed = client.feed(query, options) for page in feed: print('Page stats: ', page.stats) for event in page: eventType = event['type'] if (eventType == 'add'): # Do something on add print('Add event: ', event) elif (eventType == 'update'): # Do something on update print('Update event: ', event) elif (eventType == 'remove'): # Do something on remove print('Remove event: ', event) ``` The event feed iterator will stop once there are no more events to poll. Each page includes a top-level `cursor`. 
You can include the cursor in a [`FeedOptions` object](#event-feed-opts) passed to `feed()` to poll for events after the cursor:

```python
import time
from datetime import datetime, timedelta
from fauna import fql
from fauna.client import Client, FeedOptions

def process_feed(client, query, start_ts=None, sleep_time=300):
    cursor = None
    while True:
        options = FeedOptions(
            start_ts=start_ts if cursor is None else None,
            cursor=cursor,
        )

        feed = client.feed(query, options)
        for page in feed:
            for event in page:
                event_type = event['type']
                if event_type == 'add':
                    # Do something on add
                    print('Add event: ', event)
                elif event_type == 'update':
                    # Do something on update
                    print('Update event: ', event)
                elif event_type == 'remove':
                    # Do something on remove
                    print('Remove event: ', event)
            # Store the cursor of the last page
            cursor = page.cursor

        # Clear the start timestamp after the first request
        start_ts = None

        print(f"Sleeping for {sleep_time} seconds...")
        time.sleep(sleep_time)

client = Client()
query = fql('Product.all().eventsOn(.price, .stock)')

# Calculate timestamp for 10 minutes ago
ten_minutes_ago = datetime.now() - timedelta(minutes=10)
start_ts = int(ten_minutes_ago.timestamp() * 1_000_000)

process_feed(client, query, start_ts=start_ts)
```

Alternatively, you can get events as a single, flat array:

```python
import time
from datetime import datetime, timedelta
from fauna import fql
from fauna.client import Client, FeedOptions

def process_feed(client, query, start_ts=None, sleep_time=300):
    cursor = None
    while True:
        options = FeedOptions(
            start_ts=start_ts if cursor is None else None,
            cursor=cursor,
        )

        feed = client.feed(query, options)
        for event in feed.flatten():
            event_type = event['type']
            if event_type == 'add':
                # Do something on add
                print('Add event: ', event)
            elif event_type == 'update':
                # Do something on update
                print('Update event: ', event)
            elif event_type == 'remove':
                # Do something on remove
                print('Remove event: ', event)
            # Store the cursor of the last event
            cursor = event['cursor']

        # Clear the start timestamp after the first request
        start_ts = None

        print(f"Sleeping for {sleep_time} seconds...")
        time.sleep(sleep_time)

client = Client()
query = fql('Product.all().eventsOn(.price, .stock)')

# Calculate timestamp for 10 minutes ago
ten_minutes_ago = datetime.now() - timedelta(minutes=10)
start_ts = int(ten_minutes_ago.timestamp() * 1_000_000)

process_feed(client, query, start_ts=start_ts)
```

If needed, you can store the cursor as a collection document. For an example, see the [event feeds app](../../sample-apps/event-feeds/).

### [](#error-handling)Error handling

If a non-retryable error occurs when opening or processing an event feed, Fauna raises a `FaunaException`:

```python
from datetime import datetime, timedelta

from fauna import fql
from fauna.client import Client, FeedOptions
from fauna.errors import FaunaException

client = Client()

# Calculate timestamp for 10 minutes ago
ten_minutes_ago = datetime.now() - timedelta(minutes=10)
start_ts = int(ten_minutes_ago.timestamp() * 1_000_000)

options = FeedOptions(
    start_ts=start_ts
)

feed = client.feed(fql(
    'Product.all().eventsOn(.price, .stock)'
), options)

for page in feed:
    try:
        for event in page:
            print(event)
            # ...
    except FaunaException as e:
        print('error occurred with event processing: ', e)
        # The current event will be skipped
```

Each page’s `cursor` contains the cursor for the page’s last successfully processed event. If you’re using a [loop to poll for changes](#loop), using the cursor will result in skipping any events that caused errors.
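To recover from such an error without reprocessing events that already succeeded, you can combine the polling loop with the page cursor. The following is a minimal sketch, not a definitive implementation: it reuses the `client`, query, and ten-minute `start_ts` pattern from the examples above, and the retry-forever loop is illustrative only.

```python
import time
from datetime import datetime, timedelta

from fauna import fql
from fauna.client import Client, FeedOptions
from fauna.errors import FaunaException

client = Client()
query = fql('Product.all().eventsOn(.price, .stock)')

ten_minutes_ago = datetime.now() - timedelta(minutes=10)
start_ts = int(ten_minutes_ago.timestamp() * 1_000_000)
cursor = None

while True:
    # Use the start timestamp on the first request, then the last page's cursor.
    options = FeedOptions(
        start_ts=start_ts if cursor is None else None,
        cursor=cursor,
    )
    try:
        feed = client.feed(query, options)
        for page in feed:
            for event in page:
                print(event['type'], event)
            # Track the cursor of the last successfully processed page.
            cursor = page.cursor
        break  # No more events to poll.
    except FaunaException as e:
        # Resuming from `cursor` skips the events that caused the error.
        # Add backoff or a retry limit in real code.
        print('Feed error, resuming from last cursor:', e)
        time.sleep(5)
```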
### [](#event-feed-opts)Event feed options

The client configuration sets default options for the `feed()` method. You can pass a `FeedOptions` object to override these defaults:

```python
from fauna import fql
from fauna.client import Client, FeedOptions
from datetime import timedelta

client = Client()

options = FeedOptions(
    max_attempts=3,
    max_backoff=20,
    query_timeout=timedelta(seconds=5),
    page_size=None,
    cursor=None,
    start_ts=None,
)

client.feed(fql('Product.all().eventSource()'), options)
```

For supported properties, see [FeedOptions](https://fauna.github.io/fauna-python/latest/api/fauna/client/client.html#FeedOptions) in the API reference.

### [](#sample-app-2)Sample app

For a practical example that uses the Python driver with event feeds, check out the [event feeds sample app](../../sample-apps/event-feeds/).

## [](#event-streaming)Event streams

The driver supports [event streams](../../../learn/cdc/).

### [](#start-a-stream)Start a stream

To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets).

To stream the source’s events, pass the event source to `stream()`:

```python
import fauna

from fauna import fql
from fauna.client import Client, StreamOptions

client = Client()

response = client.query(fql('''
  let set = Product.all()
  {
    initialPage: set.pageSize(10),
    eventSource: set.eventSource()
  }
  '''))

initial_page = response.data['initialPage']
event_source = response.data['eventSource']

client.stream(event_source)
```

You can also pass a query that produces an event source directly to `stream()`:

```python
query = fql('Product.all().eventsOn(.price, .stock)')

client.stream(query)
```

### [](#iterate-on-a-stream)Iterate on a stream

`stream()` returns an iterator that emits events as they occur. You can use a `for` loop to iterate through the events:

```python
query = fql('Product.all().eventsOn(.price, .stock)')

with client.stream(query) as stream:
    for event in stream:
        eventType = event['type']
        if eventType == 'add':
            print('Add event: ', event)
            ## ...
        elif eventType == 'update':
            print('Update event: ', event)
            ## ...
        elif eventType == 'remove':
            print('Remove event: ', event)
            ## ...
```

### [](#close-a-stream)Close a stream

Use `close()` to close a stream:

```python
query = fql('Product.all().eventsOn(.price, .stock)')

count = 0
with client.stream(query) as stream:
    for event in stream:
        print('Stream event', event)
        # ...
        count += 1
        if count == 2:
            stream.close()
```

### [](#error-handling-2)Error handling

If a non-retryable error occurs when opening or processing a stream, Fauna raises a `FaunaException`:

```python
import fauna

from fauna import fql
from fauna.client import Client
from fauna.errors import FaunaException

client = Client(secret='FAUNA_SECRET')

try:
    with client.stream(fql(
        'Product.all().eventsOn(.price, .stock)'
    )) as stream:
        for event in stream:
            print(event)
            # ...
except FaunaException as e:
    print('error occurred with stream: ', e)
```

### [](#stream-options)Stream options

The [Client configuration](#config) sets default options for the `stream()` method.
You can pass a `StreamOptions` object to override these defaults:

```python
options = StreamOptions(
    max_attempts=5,
    max_backoff=1,
    start_ts=1710968002310000,
    status_events=True
)

client.stream(fql('Product.all().eventSource()'), options)
```

For supported properties, see [StreamOptions](https://fauna.github.io/fauna-python/latest/api/fauna/client/client.html#StreamOptions) in the API reference.

## [](#debug-logging)Debug logging

Logging is handled using Python’s standard `logging` package under the `fauna` namespace. Logs include the HTTP request with body (excluding the `Authorization` header) and the full HTTP response.

To enable logging:

```python
import logging

from fauna.client import Client
from fauna import fql

logging.basicConfig(
    level=logging.DEBUG
)

client = Client()
client.query(fql('42'))
```

For configuration options or to set specific log levels, see Python’s [Logging HOWTO](https://docs.python.org/3/howto/logging.html).

# Fauna v10 Go client driver (current)

| Version: 3.0.2 | Repository: fauna/fauna-go |
| --- | --- |

Fauna’s Go client driver lets you run FQL queries from Go applications. This guide shows how to set up the driver and use it to run FQL queries.

## [](#supported-go-versions)Supported Go versions

* 1.19
* 1.20
* 1.21
* 1.22

## [](#supported-cloud-runtimes)Supported cloud runtimes

* AWS Lambda (See [AWS Lambda connections](#aws-lambda-connections))
* Netlify Functions
* Vercel Functions

## [](#api-reference)API reference

API reference documentation for the driver is available on [pkg.go.dev](https://pkg.go.dev/github.com/fauna/fauna-go/v3#section-documentation).

## [](#installation)Installation

To install the driver, run:

```bash
go get github.com/fauna/fauna-go/v3
```

## [](#basic-usage)Basic usage

The following application:

* Initializes a client instance to connect to Fauna
* Composes a basic FQL query using an `FQL` string template
* Runs the query using `Query()`

```go
package main

import (
	"encoding/json"
	"fmt"

	"github.com/fauna/fauna-go/v3"
)

func main() {
	// Initialize the client to connect to Fauna
	client := fauna.NewClient(
		"FAUNA_SECRET",
		fauna.DefaultTimeouts(),
	)

	// Compose a query
	query, _ := fauna.FQL(`
		Product.sortedByPriceLowToHigh() {
			name,
			description,
			price
		}
	`, nil)

	res, err := client.Query(query)
	if err != nil {
		panic(err)
	}

	jsonData, _ := json.Marshal(res.Data)
	fmt.Println(string(jsonData))
}
```

## [](#connect-to-fauna)Connect to Fauna

Each Fauna query is an independently authenticated request to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). You authenticate with Fauna using an [authentication secret](../../../learn/security/authentication/#secrets).

### [](#get-an-authentication-secret)Get an authentication secret

Fauna supports several [secret types](../../../learn/security/authentication/#secret-types). For testing, you can create a [key](../../../learn/security/keys/), which is a type of secret:

1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/).
2. On the **Explorer** page, create a database.
3. In the database’s **Keys** tab, click **Create Key**.
4. Choose a **Role** of **server**.
5. Click **Save**.
6. Copy the **Key Secret**.

The secret is scoped to the database.

### [](#initialize-a-client)Initialize a client

To send query requests to Fauna, initialize a `Client` instance.
The `NewDefaultClient()` method initializes a client using: * A Fauna authentication secret in the `FAUNA_SECRET` environment variable * A base URL used by the driver for [Fauna Core HTTP API](../../../reference/http/reference/core-api/) requests in the `FAUNA_ENDPOINT` environment variable. * Default client configuration options ```go client, clientErr := fauna.NewDefaultClient() if clientErr != nil { panic(clientErr) } ``` To pass configuration options, use `NewClient()` to initialize the client: ```go client := fauna.NewClient( "FAUNA_SECRET", fauna.DefaultTimeouts(), ) ``` `NewClient()` requires `secret` and `timeouts` arguments. For timeouts and more configuration options, see [Client configuration](#config). ### [](#connect-to-a-child-database)Connect to a child database A [scoped key](../../../learn/security/keys/#scoped-keys) lets you use a parent database’s admin key to send query requests to its child databases. For example, if you have an admin key for a parent database and want to connect to a child database named `childDB`, you can create a scoped key using the following format: ``` // Scoped key that impersonates an `admin` key for // the `childDB` child database. fn...:childDB:admin ``` You can then initialize a `Client` instance using the scoped key: ```go client := fauna.NewClient( "fn...:childDB:admin", fauna.DefaultTimeouts(), ) ``` ### [](#multiple-connections)Multiple connections You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies. You can create multiple client instances to connect to Fauna using different credentials or client configurations. ### [](#aws-lambda-connections)AWS Lambda connections AWS Lambda freezes, thaws, and reuses execution environments for Lambda functions. See [Lambda execution environment](https://docs.aws.amazon.com/lambda/latest/dg/running-lambda-code.html). When an execution environment is thawed, Lambda only runs the function’s handler code. Objects declared outside of the handler method remain initialized from before the freeze. Lambda doesn’t re-run initialization code outside the handler. Fauna drivers keep socket connections that can time out during long freezes, causing `ECONNRESET` errors when thawed. To prevent timeouts, create Fauna client connections inside function handlers. Fauna drivers use lightweight HTTP connections. You can create new connections for each request while maintaining good performance. ## [](#run-fql-queries)Run FQL queries Use `FQL` string templates to compose FQL queries. Run the queries using `Query()`: ```go query, _ := fauna.FQL(`Product.sortedByPriceLowToHigh()`, nil) client.Query(query) ``` By default, `Query()` uses query options from the [Client configuration](#config). You can pass options to `Query()` to override these defaults. See [Query options](#query-opts). You can only compose FQL queries using string templates. ### [](#var)Variable interpolation Use `${}` to pass native Go variables as `map[string]any` to `FQL`. You can escape a variable by prepending an additional `$`. 
```go
// Create a native Go var
collectionName := "Product"

// Pass the var to an FQL query
query, _ := fauna.FQL(`
	let collection = Collection(${collectionName})
	collection.sortedByPriceLowToHigh()
`, map[string]any{"collectionName": collectionName})

client.Query(query)
```

The driver encodes interpolated variables to an appropriate [FQL type](../../../reference/fql/types/) and uses the [wire protocol](../../../reference/http/reference/wire-protocol/) to pass the query to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). This helps prevent injection attacks.

### [](#query-composition)Query composition

You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query:

```go
func main() {
	client := fauna.NewClient(
		"FAUNA_SECRET",
		fauna.DefaultTimeouts(),
	)

	// Create a reusable query fragment.
	productQuery, _ := fauna.FQL(`
		Product.byName("pizza").first()
	`, nil)

	// Use the fragment in another FQL query.
	query, _ := fauna.FQL(`
		let product = ${product}
		product {
			name,
			price
		}
	`, map[string]any{"product": productQuery})

	client.Query(query)
}
```

### [](#structs)Structs

The driver supports user-defined structs:

```go
type Product struct {
	Name        string `fauna:"name"`
	Description string `fauna:"description"`
	Price       int    `fauna:"price"`
}

func main() {
	client := fauna.NewClient(
		"FAUNA_SECRET",
		fauna.DefaultTimeouts(),
	)

	newProduct := Product{"key limes", "Organic, 1 ct", 95}

	query, _ := fauna.FQL(`Product.create(${product})`, map[string]any{"product": newProduct})

	client.Query(query)
}
```

### [](#pagination)Pagination

Use `Paginate()` to iterate through a Set that contains more than one page of results. `Paginate()` accepts the same [Query options](#query-opts) as `Query()`.

```go
// Adjust `pageSize()` size as needed.
query, _ := fauna.FQL(`
	Product
		.byName("limes")
		.pageSize(2)
`, nil)

paginator := client.Paginate(query)

for {
	page, _ := paginator.Next()

	var pageItems []any
	page.Unmarshal(&pageItems)

	for _, item := range pageItems {
		fmt.Println(item)
	}

	if !paginator.HasNext() {
		break
	}
}
```

### [](#query-stats)Query stats

Successful query responses and the following error types include [query stats](../../../reference/http/reference/query-stats/):

* `ErrAbort`
* `ErrAuthentication`
* `ErrAuthorization`
* `ErrContendedTransaction`
* `ErrInvalidRequest`
* `ErrQueryCheck`
* `ErrQueryRuntime`
* `ErrQueryTimeout`
* `ErrServiceInternal`
* `ErrServiceTimeout`
* `ErrThrottling`

```go
query, _ := fauna.FQL(`"Hello world"`, nil)

res, err := client.Query(query)
if err != nil {
	if faunaErr, ok := err.(*fauna.ErrQueryCheck); ok {
		jsonData, _ := json.Marshal(faunaErr.QueryInfo.Stats)
		fmt.Println(string(jsonData))
	}
	panic(err)
}

jsonData, _ := json.Marshal(res.Stats)
fmt.Println(string(jsonData))
```

## [](#config)Client configuration

The `Client` instance comes with reasonable configuration defaults. We recommend using the defaults in most cases.

If needed, you can use `NewClient()` to configure the client and override the defaults. This also lets you set default [Query options](#query-opts).
```go
secret := "FAUNA_SECRET"

timeouts := fauna.Timeouts{
	QueryTimeout:          time.Minute,
	ClientBufferTimeout:   time.Second * 30,
	ConnectionTimeout:     time.Second * 10,
	IdleConnectionTimeout: time.Minute * 5,
}

client := fauna.NewClient(
	// Configure the client
	secret,
	timeouts,
	fauna.URL("https://db.fauna.com"),
	fauna.AdditionalHeaders(map[string]string{
		"foo": "bar",
	}),
	fauna.Linearized(false),
	fauna.MaxAttempts(5),
	fauna.MaxBackoff(time.Minute),
	fauna.MaxContentionRetries(5),
	// Set default query options
	fauna.DefaultTypecheck(true),
	fauna.QueryTags(map[string]string{
		"tag": "value",
	}),
	fauna.QueryTimeout(time.Second*60),
)
```

For supported parameters, see [NewClient](https://pkg.go.dev/github.com/fauna/fauna-go/v3#NewClient) in the API reference.

### [](#timeouts)Timeouts

`NewClient()` requires a `timeouts` argument. The argument must contain a `Timeouts` struct:

```go
timeouts := fauna.Timeouts{
	QueryTimeout:          time.Second * 5,
	ClientBufferTimeout:   time.Second * 5,
	ConnectionTimeout:     time.Second * 5,
	IdleConnectionTimeout: time.Second * 5,
}

client := fauna.NewClient(
	"FAUNA_SECRET",
	timeouts,
)
```

For default timeouts, use `DefaultTimeouts()`:

```go
client := fauna.NewClient(
	"FAUNA_SECRET",
	fauna.DefaultTimeouts(),
)
```

For supported fields, see [Timeouts](https://pkg.go.dev/github.com/fauna/fauna-go/v3#Timeouts) in the API reference.

### [](#configuration-functions)Configuration functions

To configure the client and set default query options, pass one or more `ClientConfigFn` functions to `NewClient()`:

```go
client := fauna.NewClient(
	"FAUNA_SECRET",
	fauna.DefaultTimeouts(),
	// Start configuration functions
	fauna.URL("https://db.fauna.com"),
	fauna.AdditionalHeaders(map[string]string{
		"foo": "bar",
	}),
	fauna.Linearized(false),
	fauna.MaxAttempts(5),
	fauna.MaxBackoff(time.Minute),
	fauna.MaxContentionRetries(5),
	// Configuration functions for
	// default query options
	fauna.DefaultTypecheck(true),
	fauna.QueryTags(map[string]string{
		"tag": "value",
	}),
	fauna.QueryTimeout(time.Second*60),
)
```

For supported functions, see [ClientConfigFn](https://pkg.go.dev/github.com/fauna/fauna-go/v3#ClientConfigFn) in the API reference.

### [](#retries)Retries

By default, the client automatically retries a query if the request returns a 429 HTTP status code. Retries use an exponential backoff.

Use the `MaxBackoff` [configuration function](#configuration-functions) to set the maximum time between retries. Similarly, use `MaxAttempts` to set the maximum number of retry attempts, including the initial request.

## [](#query-opts)Query options

The [Client configuration](#config) sets default query options for the following methods:

* `Query()`
* `Paginate()`
* `Stream()`

To override these defaults, pass one or more `QueryOptFn` functions to the method:

```go
options := []fauna.QueryOptFn{
	fauna.Tags(map[string]string{
		"name": "hello world query",
	}),
	fauna.Timeout(time.Minute),
	fauna.Traceparent("00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00"),
	fauna.Typecheck(true),
}

query, _ := fauna.FQL(`"Hello world"`, nil)
client.Query(query, options...)
```

For supported functions, see [QueryOptFn](https://pkg.go.dev/github.com/fauna/fauna-go/v3#QueryOptFn) in the API reference.

## [](#event-feeds)Event feeds

The driver supports [event feeds](../../../learn/cdc/#event-feeds). An event feed asynchronously polls an [event source](../../../learn/cdc/) for events.

To use event feeds, you must have a Pro or Enterprise plan.
### [](#request-an-event-feed)Request an event feed To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets). To get paginated events, pass the event source to `Feed()`. This lets you fetch a page of initial results followed by an event feed: ```go client := fauna.NewClient( "FAUNA_SECRET", fauna.DefaultTimeouts(), ) query, _ := fauna.FQL(` let set = Product.all() { initialPage: set.pageSize(10), eventSource: set.eventSource() } `, nil) res, err := client.Query(query) if err != nil { log.Fatalf("Failed to query: %v", err) } var result struct { InitialPage fauna.Page `fauna:"initialPage"` EventSource fauna.EventSource `fauna:"eventSource"` } if err := res.Unmarshal(&result); err != nil { log.Fatalf("Failed to unmarshal results: %v", err) } feed, err := client.Feed(result.EventSource) if err != nil { log.Fatalf("Failed to create feed: %v", err) } ``` If changes occur between the creation of the event source and the `Feed()` request, the feed replays and emits any related events. You can also pass a query that produces an event source directly to `FeedFromQuery()`: ```go query, _ := fauna.FQL(`Product.all().eventsOn(.price, .stock)`, nil) feed, err := client.FeedFromQuery(query) if err != nil { log.Fatalf("Failed to create feed from query: %v", err) } ``` In most cases, you’ll get events after a specific [start time](#start-time) or [cursor](#cursor). #### [](#start-time)Get events after a specific start time When you first poll an event source using an event feed, you usually pass `EventFeedStartTime()` to `Feed()` or `FeedFromQuery()`. The request returns events that occurred after the specified timestamp (exclusive): ```go query, _ := fauna.FQL(`Product.all().eventSource()`, nil) // Calculate timestamp for 10 minutes ago tenMinutesAgo := time.Now().Add(-10 * time.Minute) feed, err := client.FeedFromQuery( query, fauna.EventFeedStartTime(tenMinutesAgo), ) ``` The start time must be later than the creation time of the event source. The period between the request and the start time can’t exceed the `history_days` setting for the source Set’s collection. If `history_days` is `0` or unset, the period is limited to 15 minutes. #### [](#cursor)Get events after a specific cursor After the initial request, you usually get subsequent events using the [cursor](../../../learn/cdc/#cursor) for the last page or event. To get events after a cursor (exclusive), pass an `EventFeedCursor()` to `Feed()` or `FeedFromQuery()`: ```go query, _ := fauna.FQL(`Product.all().eventSource()`, nil) feed, err := client.FeedFromQuery( query, fauna.EventFeedCursor("gsGabc456"), // Cursor for a previous page ) ``` ### [](#loop)Iterate on an event feed `Feed()` and `FeedFromQuery()` return an `EventFeed` instance. 
You can use a `for` loop to iterate through the pages of events: ```go import ( "fmt" "log" "time" "github.com/fauna/fauna-go/v3" ) func main() { client := fauna.NewClient("FAUNA_SECRET", fauna.DefaultTimeouts()) query, _ := fauna.FQL(`Product.all().eventSource()`, nil) // Calculate timestamp for 10 minutes ago tenMinutesAgo := time.Now().Add(-10 * time.Minute) feed, err := client.FeedFromQuery( query, fauna.EventFeedStartTime(tenMinutesAgo), ) if err != nil { log.Fatalf("Failed to create feed: %v", err) } for { var page fauna.FeedPage err := feed.Next(&page) if err != nil { log.Fatalf("Error getting next feed page: %v", err) } fmt.Println("Page stats:", page.Stats) for _, event := range page.Events { switch event.Type { case "add": // Do something on add fmt.Println("Add event: ", event) case "update": // Do something on update fmt.Println("Update event: ", event) case "remove": // Do something on remove fmt.Println("Remove event: ", event) } } if !page.HasNext { break } } } ``` Each page includes a top-level `cursor`. You can pass the cursor to `Feed()` or `FeedFromQuery()` using `EventFeedCursor()`: ```go import ( "fmt" "log" "time" "github.com/fauna/fauna-go/v3" ) func processFeed(client *fauna.Client, query *fauna.Query, startTs time.Time, sleepTime time.Duration) { var cursor string = "" for { // Use start time only on the first request, then use cursor. var options []fauna.FeedOptFn if !startTs.IsZero() { options = append(options, fauna.EventFeedStartTime(startTs)) // Null out startTs after first use. startTs = time.Time{} } else { options = append(options, fauna.EventFeedCursor(cursor)) } feed, err := client.FeedFromQuery(query, options...) if err != nil { log.Fatalf("Failed to create feed: %v", err) } for { var page fauna.FeedPage err := feed.Next(&page) if err != nil { log.Fatalf("Error getting next feed page: %v", err) } for _, event := range page.Events { switch event.Type { case "add": fmt.Println("Add event:", event) case "update": fmt.Println("Update event:", event) case "remove": fmt.Println("Remove event:", event) } } // Store the cursor of the last page cursor = page.Cursor // If no more pages are available, break the inner loop if !page.HasNext { break } } // Sleep between feed requests fmt.Printf("Sleeping for %v seconds...\n", sleepTime.Seconds()) time.Sleep(sleepTime) } } func main() { client := fauna.NewClient("FAUNA_SECRET", fauna.DefaultTimeouts()) // Calculate timestamp for 10 minutes ago tenMinutesAgo := time.Now().Add(-10 * time.Minute) query, err := fauna.FQL(`Product.all().eventsOn(.price, .stock)`, nil) if err != nil { log.Fatalf("Failed to create FQL query: %v", err) } sleepTime := 300 * time.Second processFeed(client, query, tenMinutesAgo, sleepTime) } ``` If needed, you can store the cursor as a collection document: ```go import ( "fmt" "log" "time" "github.com/fauna/fauna-go/v3" ) func processFeedWithCursor(client *fauna.Client, query *fauna.Query, startTs time.Time, sleepTime time.Duration) { // Ensure `Cursor` collection exists createCursorCollection, err := fauna.FQL(` if (Collection.byName("Cursor").exists() == false) { Collection.create({ name: "Cursor", fields: { name: { signature: "String" }, value: { signature: "String?" 
} }, constraints: [ { unique: [ { field: ".name", mva: false } ] } ], indexes: { byName: { terms: [ { field: ".name", mva: false } ] } } }) } else { null } `, nil) if err != nil { log.Fatalf("Failed to create Cursor collection: %v", err) } if _, err := client.Query(createCursorCollection); err != nil { log.Fatalf("Failed to create Cursor collection: %v", err) } // Ensure `ProductInventory` document exists in `Cursor` createProductInventoryCursor, err := fauna.FQL(` if (Collection("Cursor").byName("ProductInventory").first() == null) { Cursor.create({ name: "ProductInventory", value: null }) } else { null } `, nil) if err != nil { log.Fatalf("Failed to create ProductInventory cursor: %v", err) } if _, err := client.Query(createProductInventoryCursor); err != nil { log.Fatalf("Failed to create ProductInventory cursor: %v", err) } for { // Fetch existing cursor from the `Cursor` collection cursorQuery, err := fauna.FQL(`Cursor.byName("ProductInventory").first()`, nil) if err != nil { log.Fatalf("Failed to create cursor query: %v", err) } cursorRes, err := client.Query(cursorQuery) if err != nil { log.Fatalf("Failed to fetch cursor: %v", err) } // Unmarshal cursor data into a map var cursorData map[string]interface{} if err := cursorRes.Unmarshal(&cursorData); err != nil { log.Fatalf("Failed to unmarshal cursor result: %v", err) } // Extract the cursor value cursor, _ := cursorData["cursor"].(string) // Set options based on cursor existence var options []fauna.FeedOptFn if cursor == "" { options = append(options, fauna.EventFeedStartTime(startTs)) } else { // Here we ensure that the query supports cursors if query == nil { log.Fatalf("Query is nil; unable to create feed with cursor.") } options = append(options, fauna.EventFeedCursor(cursor)) } // Create the feed feed, err := client.FeedFromQuery(query, options...) if err != nil { log.Fatalf("Failed to create feed: %v", err) } for { var page fauna.FeedPage if err := feed.Next(&page); err != nil { log.Fatalf("Error getting next feed page: %v", err) } for _, event := range page.Events { switch event.Type { case "add": fmt.Println("Add event: ", event) case "update": fmt.Println("Update event: ", event) case "remove": fmt.Println("Remove event: ", event) } } // Store the cursor of the last page in the `Cursor` collection cursor = page.Cursor updateCursor, err := fauna.FQL(fmt.Sprintf(` Cursor.byName("ProductInventory").first()!.update({ value: "%s" }) `, cursor), nil) if err != nil { log.Fatalf("Failed to create update cursor query: %v", err) } if _, err := client.Query(updateCursor); err != nil { log.Fatalf("Failed to update cursor: %v", err) } fmt.Printf("Cursor updated: %s\n", cursor) startTs = time.Time{} fmt.Printf("Sleeping for %v seconds...\n", sleepTime.Seconds()) time.Sleep(sleepTime) } } } func main() { client := fauna.NewClient("FAUNA_SECRET", fauna.DefaultTimeouts()) // Calculate timestamp for 10 minutes ago tenMinutesAgo := time.Now().Add(-10 * time.Minute) query, err := fauna.FQL(`Product.all().eventsOn(.price, .stock)`, nil) if err != nil { log.Fatalf("Failed to create FQL query: %v", err) } sleepTime := 300 * time.Second processFeedWithCursor(client, query, tenMinutesAgo, sleepTime) } ``` ### [](#error-handling)Error handling Errors can occur in two places: * While fetching a page * While iterating a page’s events This distinction allows for you to ignore errors originating from event processing. 
For example:

```go
import (
	"fmt"
	"log"
	"time"

	"github.com/fauna/fauna-go/v3"
)

func main() {
	client := fauna.NewClient("FAUNA_SECRET", fauna.DefaultTimeouts())

	query, _ := fauna.FQL(`Product.all().eventSource()`, nil)

	// Calculate timestamp for 10 minutes ago
	tenMinutesAgo := time.Now().Add(-10 * time.Minute)

	feed, err := client.FeedFromQuery(
		query,
		fauna.EventFeedStartTime(tenMinutesAgo),
	)
	if err != nil {
		log.Fatalf("Failed to create feed: %v", err)
	}

	for {
		var page fauna.FeedPage
		err := feed.Next(&page)
		if err != nil {
			log.Fatalf("Error getting next feed page: %v", err)
		}

		fmt.Println("Page stats:", page.Stats)

		for _, event := range page.Events {
			func() {
				defer func() {
					if r := recover(); r != nil {
						log.Printf("Error processing event: %v", r)
					}
				}()
				switch event.Type {
				case "add":
					fmt.Println("Add event: ", event)
				case "update":
					fmt.Println("Update event: ", event)
				case "remove":
					fmt.Println("Remove event: ", event)
				default:
					log.Printf("Unknown event type: %s", event.Type)
				}
			}()
		}

		if !page.HasNext {
			break
		}
	}
}
```

Each page’s `cursor` contains the cursor for the page’s last successfully processed event. If you’re using a [loop to poll for changes](#loop), you can use the cursor to skip any events that caused errors.

### [](#event-feed-opts)Event feed options

Both `Feed()` and `FeedFromQuery()` accept [FeedOptFn](https://pkg.go.dev/github.com/fauna/fauna-go/v3#FeedOptFn) functions as arguments.

Use `EventFeedStartTime()` to start the feed at a specific timestamp:

```go
query, _ := fauna.FQL(`Product.all().eventSource()`, nil)

tenMinutesAgo := time.Now().Add(-10 * time.Minute)

feed, err := client.FeedFromQuery(
	query,
	fauna.EventFeedStartTime(tenMinutesAgo),
)
```

Use `EventFeedCursor()` to start the feed at a specific event or page cursor:

```go
query, _ := fauna.FQL(`Product.all().eventSource()`, nil)

feed, err := client.FeedFromQuery(
	query,
	fauna.EventFeedCursor("gsGabc456"),
)
```

Use `EventFeedPageSize()` to set the maximum number of events returned per page:

```go
query, _ := fauna.FQL(`Product.all().eventSource()`, nil)

feed, err := client.FeedFromQuery(
	query,
	fauna.EventFeedCursor("gsGabc456"),
	fauna.EventFeedPageSize(10),
)
```

For supported functions, see [FeedOptFn](https://pkg.go.dev/github.com/fauna/fauna-go/v3#FeedOptFn) in the API reference.

## [](#event-streaming)Event streams

The driver supports [event streams](../../../learn/cdc/).

### [](#start-a-stream)Start a stream

To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets).

The driver represents event sources as `EventSource` values. To stream the source’s events, pass the event source to `Stream()`.
This lets you output a stream alongside normal query results:

```go
type Product struct {
	Name        string `fauna:"name"`
	Description string `fauna:"description"`
	Price       int    `fauna:"price"`
}

func main() {
	client := fauna.NewClient(
		"FAUNA_SECRET",
		fauna.DefaultTimeouts(),
	)

	dataLoad, _ := fauna.FQL(`
		let products = Product.all()
		{
			Products: products.toArray(),
			EventSource: products.eventSource()
		}
	`, nil)

	data, err := client.Query(dataLoad)
	if err != nil {
		panic(err)
	}

	queryResult := struct {
		Products    []Product
		EventSource fauna.EventSource
	}{}

	if err := data.Unmarshal(&queryResult); err != nil {
		panic(err)
	}

	fmt.Println("Existing products:")
	for _, product := range queryResult.Products {
		fmt.Println(product)
	}

	events, err := client.Stream(queryResult.EventSource)
	if err != nil {
		panic(err)
	}
	defer events.Close()

	fmt.Println("Products from streaming:")

	var event fauna.Event
	for {
		err := events.Next(&event)
		if err != nil {
			panic(err)
		}
		switch event.Type {
		case fauna.AddEvent, fauna.UpdateEvent, fauna.RemoveEvent:
			var product Product
			if err = event.Unmarshal(&product); err != nil {
				panic(err)
			}
			fmt.Println(product)
		}
	}
}
```

You can also pass a query that produces an event source directly to `StreamFromQuery()`:

```go
type Product struct {
	Name        string `fauna:"name"`
	Description string `fauna:"description"`
	Price       int    `fauna:"price"`
}

func main() {
	client := fauna.NewClient(
		"FAUNA_SECRET",
		fauna.DefaultTimeouts(),
	)

	streamQuery, _ := fauna.FQL("Product.all().eventSource()", nil)

	events, err := client.StreamFromQuery(streamQuery, nil)
	if err != nil {
		panic(err)
	}
	defer events.Close()

	var event fauna.Event
	for {
		err := events.Next(&event)
		if err != nil {
			panic(err)
		}
		switch event.Type {
		case fauna.AddEvent, fauna.UpdateEvent, fauna.RemoveEvent:
			var product Product
			if err = event.Unmarshal(&product); err != nil {
				panic(err)
			}
			fmt.Println(product)
		}
	}
}
```

### [](#stream-options)Stream options

Both `Stream()` and `StreamFromQuery()` accept [StreamOptFn](https://pkg.go.dev/github.com/fauna/fauna-go/v3#StreamOptFn) functions as arguments.

Use `StreamStartTime()` to restart a stream at a specific timestamp:

```go
streamQuery, _ := fauna.FQL(`Product.all().eventSource()`, nil)

tenMinutesAgo := time.Now().Add(-10 * time.Minute)

client.StreamFromQuery(streamQuery, nil, fauna.StreamStartTime(tenMinutesAgo))
```

Use `EventCursor()` to resume a stream after a disconnect:

```go
streamQuery, _ := fauna.FQL(`Product.all().eventSource()`, nil)

client.StreamFromQuery(streamQuery, nil, fauna.EventCursor("abc2345=="))
```

For supported functions, see [StreamOptFn](https://pkg.go.dev/github.com/fauna/fauna-go/v3#StreamOptFn) in the API reference.

## [](#debug-logging)Debug logging

To enable debug logging, set the `FAUNA_DEBUG` environment variable to an integer for the value of the desired [slog level](https://pkg.go.dev/log/slog#Level):

* `slog.LevelInfo` logs all HTTP responses from Fauna.
* `slog.LevelDebug` includes the HTTP request body. The `Authorization` header is not redacted.

For Go versions before 1.21, the driver uses a [log.Logger](https://pkg.go.dev/log#Logger). For 1.21+, the driver uses the [slog.Logger](https://pkg.go.dev/log/slog#Logger).

You can optionally define your own Logger. For an example, see `CustomLogger` in [`logging_slog_test.go`](https://github.com/fauna/fauna-go/blob/main/logging_slog_test.go).

# Fauna v10 .NET/C# client driver (current)

| Version: 1.0.1 | Repository: fauna/fauna-dotnet |
| --- | --- |

Fauna’s .NET/C# client driver lets you run FQL queries from .NET and C# applications.
This guide shows how to set up the driver and use it to run FQL queries.

## [](#supported-net-and-c-versions)Supported .NET and C# versions

* .NET 8.0
* C# ^10.0

## [](#installation)Installation

The driver is available on [NuGet](https://www.nuget.org/packages/fauna). To install it using the .NET CLI, run:

```bash
dotnet add package Fauna
```

## [](#api-reference)API reference

API reference documentation for the driver is available at [https://fauna.github.io/fauna-dotnet/](https://fauna.github.io/fauna-dotnet/).

## [](#sample-app)Sample app

For a practical example, check out the [.NET sample app](https://github.com/fauna/dotnet-sample-app).

This sample app is an e-commerce application that uses the Fauna .NET/C# driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries.

## [](#basic-usage)Basic usage

The following applications:

* Initialize a client instance to connect to Fauna
* Compose a basic FQL query using an `FQL` string template
* Run the query using `QueryAsync()` or `PaginateAsync()`
* Deserialize the results based on a provided type parameter

Use `QueryAsync()` to run a non-paginated query:

```csharp
using Fauna;
using Fauna.Exceptions;
using static Fauna.Query;

try
{
    // Initialize the client to connect to Fauna
    var config = new Configuration("FAUNA_SECRET");
    var client = new Client(config);

    // Compose a query
    var query = FQL($@"
        Product.byName('cups').first() {{
            name,
            description,
            price
        }}
    ");

    // Run the query
    // Optionally specify the expected result type as a type parameter.
    // If not provided, the value will be deserialized as object.
    var response = await client.QueryAsync<Dictionary<string, object>>(query);

    Console.WriteLine(response.Data["name"]);
    Console.WriteLine(response.Data["description"]);
    Console.WriteLine(response.Data["price"]);
    Console.WriteLine("--------");
}
catch (FaunaException e)
{
    Console.WriteLine(e);
}
```

Queries that return a [Set](../../../reference/fql/types/#set) are automatically paginated. Use `PaginateAsync()` to iterate through paginated results:

```csharp
using Fauna;
using Fauna.Exceptions;
using static Fauna.Query;

try
{
    // Initialize the client to connect to Fauna
    var client = new Client("FAUNA_SECRET");

    // Compose a query
    var query = FQL($@"
        Product.sortedByPriceLowToHigh() {{
            name,
            description,
            price
        }}
    ");

    // Run the query
    // PaginateAsync returns an IAsyncEnumerable of pages
    var response = client.PaginateAsync<Dictionary<string, object>>(query);

    await foreach (var page in response)
    {
        foreach (var product in page.Data)
        {
            Console.WriteLine(product["name"]);
            Console.WriteLine(product["description"]);
            Console.WriteLine(product["price"]);
            Console.WriteLine("--------");
        }
    }
}
catch (FaunaException e)
{
    Console.WriteLine(e);
}
```

## [](#connect-to-fauna)Connect to Fauna

Each Fauna query is an independently authenticated request to the [Query HTTP API endpoint](../../../reference/http/reference/core-api/#operation/query). You authenticate with Fauna with an [authentication secret](../../../learn/security/authentication/#secrets).

### [](#get-an-authentication-secret)Get an authentication secret

Fauna supports several [secret types](../../../learn/security/authentication/#secret-types). For testing, you can create a [key](../../../learn/security/keys/), which is a type of secret:

1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/).
2. On the **Explorer** page, create a database.
3. In the database’s **Keys** tab, click **Create Key**.
4. Choose a **Role** of **server**.
5. Click **Save**.
6. Copy the **Key Secret**.
The secret is scoped to the database. ### [](#initialize-a-client)Initialize a client To send query requests to a Fauna database, initialize a `Client` instance using an authentication secret scoped to the database: ```csharp var client = new Client("FAUNA_SECRET"); ``` `Client` requires a `secret` or `configuration` argument. For configuration options, see [Client configuration](#config). ### [](#connect-to-a-child-database)Connect to a child database A [scoped key](../../../learn/security/keys/#scoped-keys) lets you use a parent database’s admin key to send query requests to its child databases. For example, if you have an admin key for a parent database and want to connect to a child database named `childDB`, you can create a scoped key using the following format: ``` // Scoped key that impersonates an `admin` key for // the `childDB` child database. fn...:childDB:admin ``` You can then initialize a `Client` instance using the scoped key: ```csharp var client = new Client("fn...:childDB:admin"); ``` ### [](#multiple-connections)Multiple connections You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies. You can create multiple client instances to connect to Fauna using different credentials or client configurations. ## [](#run-fql-queries)Run FQL queries Use `FQL` string templates to compose FQL queries. Run the queries using `QueryAsync()` or `PaginateAsync()`: ```csharp // Unpaginated query var query = FQL($@"Product.byName('cups').first()"); client.QueryAsync(query); // Paginated query // Adjust `pageSize()` size as needed var paginatedQuery = FQL($@"Category.all().pageSize(2)"); client.PaginateAsync(paginatedQuery); ``` You can only compose FQL queries using string templates. ### [](#var)Variable interpolation Use single braces `{}` to pass native variables to fql queries. Use `{{}}` to escape other single braces in the query. ```csharp // Create a native var var collectionName = "Product"; // Pass the var to an FQL query var query = FQL($@" let collection = Collection({collectionName}) collection.byName('cups').first() {{ price }}" ); client.QueryAsync(query); ``` The driver encodes interpolated variables to an appropriate [FQL type](../../../reference/fql/types/) and uses the [wire protocol](../../../reference/http/reference/wire-protocol/) to pass the query to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). This helps prevent injection attacks. ### [](#query-composition)Query composition You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query: ```csharp // Create a reusable query fragment. var product = FQL($@"Product.byName(""pizza"").first()"); // Use the fragment in another FQL query. var query = FQL($@" let product = {product} product {{ name, price }} "); client.QueryAsync(query); ``` ### [](#poco-mapping)POCO mapping With `Fauna.Mapping`, you can map a POCO class to a Fauna document or object shape: ```csharp using Fauna.Mapping; class Category { // Property names are automatically converted to camelCase. [Id] public string? Id { get; set; } // Manually specify a name by providing a string. [Field("name")] public string? CatName { get; set; } } class Product { [Id] public string? Id { get; set; } public string? Name { get; set; } public string? 
Description { get; set; }
    public int Price { get; set; }

    // Reference to document
    public Ref<Category> Category { get; set; }
}
```

* `[Id]`: Should only be used once per class on a field named `Id` that represents the Fauna document ID. It’s not encoded unless the `isClientGenerated` flag is true.
* `[Ts]`: Should only be used once per class on a field named `Ts` that represents the timestamp of a document. It’s not encoded.
* `[Collection]`: Typically goes unmodeled. Should only be used once per class on a field named `Coll` that represents the collection field of a document. It will never be encoded.
* `[Field]`: Can be associated with any field to override its name in Fauna.
* `[Ignore]`: Can be used to ignore fields during encoding and decoding.

You can use POCO classes to deserialize query responses:

```csharp
var query = FQL($@"Product.sortedByPriceLowToHigh()");
var products = client.PaginateAsync<Product>(query).FlattenAsync();

await foreach (var p in products)
{
    Console.WriteLine($"{p.Name} {p.Description} {p.Price}");
}
```

You can also use POCO classes to write to your database:

```csharp
var product = new Product
{
    Id = "12345",
    Name = "limes",
    Description = "Organic, 2 ct",
    Price = 95
};

client.QueryAsync(FQL($@"Product.create({product})"));
```

### [](#datacontext)`DataContext`

The `DataContext` class provides a schema-aware view of your database. Subclass it and configure your collections:

```csharp
class CustomerDb : DataContext
{
    public class CustomerCollection : Collection<Customer>
    {
        public Index<Customer> ByEmail(string email) => Index().Call(email);
        public Index<Customer> ByName(string name) => Index().Call(name);
    }

    public CustomerCollection Customer { get => GetCollection<CustomerCollection>(); }
}
```

`DataContext` provides `Client` querying, which automatically maps your collections to POCO equivalents, even when type hints are not provided.

```csharp
var db = client.DataContext<CustomerDb>();

var result = await db.QueryAsync(FQL($"Customer.all().first()"));
var customer = (Customer)result.Data!;

Console.WriteLine(customer.Name);
```

### [](#document-references)Document references

The driver supports [document references](../../../learn/data-model/relationships/) using the `Ref` type. There are several ways to work with document references using the driver:

1. Fetch the reference without loading the referenced document:

   ```csharp
   // Gets a Product document.
   // The document's `category` field contains a
   // reference to a Category document. The
   // `category` field is not projected.
   var query = FQL($@"
       Product.byName('limes').first()
   ");

   var response = await client.QueryAsync<Product>(query);
   var product = response.Data;
   ```

2. [Project](../../../reference/fql/projection/) the document reference to load the referenced document:

   ```csharp
   // Gets a Product document.
   // The `category` field is projected to load the
   // referenced document.
   var query = FQL($@"
       Product.byName('limes').first() {{
           name,
           category {{
               name
           }}
       }}
   ");

   var response = await client.QueryAsync<Dictionary<string, object>>(query);
   var product = response.Data;

   Console.WriteLine(product["name"]);

   // Prints the category name.
   var category = (Dictionary<string, object>)product["category"];
   Console.WriteLine(category["name"]);
   ```

3. Use `LoadRefAsync()` to load the referenced document:

   ```csharp
   // Gets a Product document.
   var query = FQL($@"Product.byName('limes').first()");

   var response = await client.QueryAsync<Product>(query);
   var product = response.Data;

   // Loads the Category document referenced in
   // the Product document.
   var category = await client.LoadRefAsync(product.Category);

   // Prints the category name.
   Console.WriteLine(category.Name);
   ```

If the reference is already loaded, it returns the cached value without making another query to Fauna:

```csharp
// This won't run another query if the referenced
// document is already loaded.
var sameCategory = await client.LoadRefAsync(product.Category);
```

#### [](#null-documents)Null documents

A [NullDoc](../../../reference/fql/types/#nulldoc), or null document, can be handled two ways:

1. Let the driver throw an exception and do something with it:

   ```csharp
   try
   {
       await client.QueryAsync(FQL($"SomeColl.byId('123')"));
   }
   catch (NullDocumentException e)
   {
       Console.WriteLine(e.Id); // "123"
       Console.WriteLine(e.Collection.Name); // "SomeColl"
       Console.WriteLine(e.Cause); // "not found"
   }
   ```

2. Wrap your expected type in a `Ref<>` or `NamedRef<>`. You can wrap `Dictionary<string, object>` and POCOs.

   ```csharp
   var q = FQL($"Collection.byName('Fake')");
   var r = (await client.QueryAsync<NamedRef<Dictionary<string, object>>>(q)).Data;

   if (r.Exists)
   {
       Console.WriteLine(r.Name); // "Fake"
       Console.WriteLine(r.Collection.Name); // "Collection"
       var doc = r.Get(); // A dictionary with id, coll, ts, and any user-defined fields.
   }
   else
   {
       Console.WriteLine(r.Name); // "Fake"
       Console.WriteLine(r.Collection.Name); // "Collection"
       Console.WriteLine(r.Cause); // "not found"
       r.Get(); // this throws a NullDocumentException
   }
   ```

### [](#pagination)Pagination

When you wish to paginate a [set](../../../reference/fql/types/#set), such as a collection or index, use `PaginateAsync()`.

Example of a query that returns a Set:

```csharp
var query = FQL($"Customer.all()");

await foreach (var page in client.PaginateAsync<Customer>(query))
{
    // handle each page
}

await foreach (var item in client.PaginateAsync<Customer>(query).FlattenAsync())
{
    // handle each item
}
```

Example of a query that returns an object with an embedded Set:

```csharp
class MyResult
{
    [Field("customers")]
    public Page<Customer>? Customers { get; set; }
}

var query = FQL($"{{customers: Customer.all()}}");
var result = await client.QueryAsync<MyResult>(query);

await foreach (var page in client.PaginateAsync(result.Data.Customers!))
{
    // handle each page
}

await foreach (var item in client.PaginateAsync(result.Data.Customers!).FlattenAsync())
{
    // handle each item
}
```

### [](#query-stats)Query stats

Successful query responses and `ServiceException` exceptions include [query stats](../../../reference/http/reference/query-stats/):

```csharp
try
{
    var client = new Client("FAUNA_SECRET");

    var query = FQL($@"'Hello world'");
    var response = await client.QueryAsync(query);

    Console.WriteLine(response.Stats.ToString());
}
catch (FaunaException e)
{
    if (e is ServiceException serviceException)
    {
        Console.WriteLine(serviceException.Stats.ToString());
        Console.WriteLine(e);
    }
    else
    {
        Console.WriteLine(e);
    }
}
```

## [](#config)Client configuration

The `Client` instance comes with reasonable configuration defaults. We recommend using the defaults in most cases.

If needed, you can configure the client and override the defaults. This also lets you set default [Query options](#query-opts).
```csharp
var config = new Configuration("FAUNA_SECRET")
{
    // Configure the client
    Endpoint = new Uri("https://db.fauna.com"),
    RetryConfiguration = new RetryConfiguration(3, TimeSpan.FromSeconds(20)),

    // Set default query options
    DefaultQueryOptions = new QueryOptions
    {
        Linearized = false,
        QueryTags = new Dictionary<string, string> { { "tag", "value" } },
        QueryTimeout = TimeSpan.FromSeconds(60),
        TraceParent = "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00",
        TypeCheck = false
    }
};

var client = new Client(config);
```

For supported properties, see [Fauna.Configuration](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_configuration.html) in the API reference.

### [](#environment-variables)Environment variables

The client configuration’s `Secret` and `Endpoint` properties default to the `FAUNA_SECRET` and `FAUNA_ENDPOINT` environment variables, respectively.

For example, if you set the following environment variables:

```bash
export FAUNA_SECRET=FAUNA_SECRET
export FAUNA_ENDPOINT=https://db.fauna.com/
```

You can initialize the client with a default configuration:

```csharp
var client = new Client();
```

### [](#retries)Retries

By default, the client automatically retries query requests that return a `limit_exceeded` [error code](../../../reference/http/reference/errors/). Retries use an exponential backoff.

The client retries a query up to three times by default. The maximum wait time between retries defaults to 20 seconds.

To override these defaults, pass a `RetryConfiguration` instance to the [Client configuration](#config).

```csharp
var config = new Configuration("FAUNA_SECRET")
{
    RetryConfiguration = new RetryConfiguration(3, TimeSpan.FromSeconds(20))
};

var client = new Client(config);
```

For supported parameters, see [Fauna.Core.RetryConfiguration](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_retry_configuration.html) in the API reference.

## [](#query-opts)Query options

The [Client configuration](#config) sets default query options for the following methods:

* `QueryAsync()`
* `PaginateAsync()`

You can pass a `QueryOptions` argument to override these defaults:

```csharp
var queryOptions = new QueryOptions
{
    Linearized = false,
    QueryTags = new Dictionary<string, string> { { "tag", "value" } },
    QueryTimeout = TimeSpan.FromSeconds(60),
    TraceParent = "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00",
    TypeCheck = true
};

var query = FQL($@"'Hello world'");
client.QueryAsync(query, queryOptions);
```

For supported properties, see [Fauna.Core.QueryOptions](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_query_options.html) in the API reference.

## [](#event-feeds)Event feeds

The driver supports [event feeds](../../../learn/cdc/#event-feeds). An event feed asynchronously polls an [event source](../../../learn/cdc/#create-an-event-source) for paginated events.

To use event feeds, you must have a Pro or Enterprise plan.

### [](#request-an-event-feed)Request an event feed

To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets).
To get paginated events, pass the event source to `EventFeedAsync()`: ```csharp // Get an event source from a supported Set EventSource eventSource = (await client.QueryAsync<EventSource>(FQL($"Product.all().eventSource()"))).Data; var feed = await client.EventFeedAsync<Product>(eventSource); ``` If changes occur between the creation of the event source and the event feed request, the feed replays and emits any related events. You can also pass a query that produces an event source directly to `EventFeedAsync()`: ```csharp var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()")); ``` If you pass an event source query to `EventFeedAsync()`, the driver creates the event source and requests the event feed at the same time. In most cases, you’ll get events after a specific [event cursor](#cursor) or [start time](#start-time). #### [](#start-time)Get events after a specific start time When you first poll an event source using an event feed, you usually include a `startTs` (start timestamp) in the [`FeedOptions` object](#event-feed-opts) that’s passed to `EventFeedAsync()`. The request returns events that occurred after the specified timestamp (exclusive). `startTs` is an integer representing a time in microseconds since the Unix epoch: ```csharp // Calculate timestamp for 10 minutes ago in microseconds long tenMinutesAgo = DateTimeOffset.UtcNow.AddMinutes(-10).ToUnixTimeMilliseconds() * 1000; var feedOptions = new FeedOptions(startTs: tenMinutesAgo); var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"), feedOptions); ``` `startTs` must be later than the creation time of the event source. The period between the request and the `startTs` can’t exceed the `history_days` setting for the source Set’s collection. If `history_days` is `0` or unset, the period is limited to 15 minutes. #### [](#cursor)Get events after a specific event cursor After the initial request, you usually get subsequent events using the [cursor](../../../learn/cdc/#cursor) for the last page or event. To get events after a cursor (exclusive), include the `cursor` in the [`FeedOptions` object](#event-feed-opts) that’s passed to `EventFeedAsync()`: ```csharp var feedOptions = new FeedOptions(cursor: "gsGabc456"); // Cursor for a previous page var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"), feedOptions); ``` You can reuse cursors across event sources with identical queries in the same database. ### [](#loop)Iterate on an event feed `EventFeedAsync()` returns a `FeedEnumerable` instance that acts as an `AsyncEnumerator`. Use `await foreach` to iterate through the pages of events: ```csharp await foreach (var page in feed) { foreach (var evt in page.Events) { Console.WriteLine($"Event Type: {evt.Type}"); Product product = evt.Data; Console.WriteLine($"Product Name: {product.Name}"); } } ``` The `FeedEnumerable` will stop when there are no more events to poll. Each page includes a top-level `cursor`. You can include the cursor in a [`FeedOptions` object](#event-feed-opts) passed to `EventFeedAsync()` to poll for events after the cursor. ### [](#error-handling)Error handling Exceptions can be raised in two different places: * While fetching a page * While iterating a page’s events This distinction allows you to ignore errors originating from event processing.
For example: ```csharp try { await foreach (var page in feed) { try { foreach (var evt in page.Events) { Console.WriteLine($"Event Type: {evt.Type}"); Product product = evt.Data; Console.WriteLine($"Product Name: {product.Name}"); } } // `EventException` is thrown for event processing errors. catch (EventException eventError) { Console.WriteLine($"Feed event error: {eventError}"); } } } catch (Exception error) { Console.WriteLine($"Non-retryable error: {error}"); } ``` Each page’s `cursor` contains the cursor for the page’s last successfully processed event. If you’re using a [loop to poll for changes](#loop), using the cursor will skip any events that caused errors. ### [](#event-feed-opts)Event feed options The client configuration sets the default options for `EventFeedAsync()`. You can pass a [`FeedOptions`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_feed_options.html) object to override these defaults: ```csharp var feedOptions = new FeedOptions( startTs: 1710968002310000, pageSize: 10, cursor: "gsGabc456" ); var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"), feedOptions); ``` For supported properties, see [`FeedOptions`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_core_1_1_feed_options.html) in the API reference. ## [](#event-streaming)Event streams The driver supports [event streams](../../../learn/cdc/). ### [](#start-a-stream)Start a stream To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets). To stream the source’s events, pass the event source to `SubscribeStream()`: ```csharp var query = FQL($@" let set = Customer.all() {{ initialPage: set.pageSize(10), eventSource: set.eventSource() }} "); var response = await client.QueryAsync<Dictionary<string, object>>(query); var eventSource = response.Data["eventSource"].ToString(); await using var stream = client.SubscribeStream<Customer>(eventSource); await foreach (var evt in stream) { Console.WriteLine($"Received Event Type: {evt.Type}"); if (evt.Data != null) // Status events won't have Data { Customer customer = evt.Data; Console.WriteLine($"Name: {customer.Name} - Email: {customer.Email}"); } } ``` You can also pass a query that produces an event source directly to `EventStreamAsync()`: ```csharp var stream = await client.EventStreamAsync<Customer>(FQL($"Customer.all().eventSource()")); await foreach (var evt in stream) { Console.WriteLine($"Received Event Type: {evt.Type}"); if (evt.Data != null) { Customer customer = evt.Data; Console.WriteLine($"Name: {customer.Name} - Email: {customer.Email}"); } } ``` ### [](#stream-options)Stream options The [Client configuration](#config) sets default options for the `SubscribeStream()` and `EventStreamAsync()` methods.
You can pass a [`StreamOptions`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_stream_options.html) object to override these defaults: ```csharp var options = new StreamOptions( token: "", cursor: "gsGghu789" ); var stream = await client.EventStreamAsync<Customer>( query: FQL($"Customer.all().eventSource()"), streamOptions: options ); await foreach (var evt in stream) { Console.WriteLine($"Received Event Type: {evt.Type}"); if (evt.Data != null) { Customer customer = evt.Data; Console.WriteLine($"Name: {customer.Name} - Email: {customer.Email}"); } } ``` ## [](#debug-logging)Debug logging To enable debug logging, set the `FAUNA_DEBUG` environment variable to an integer for the `Microsoft.Extensions.Logging.LogLevel`. For example: * `0`: `LogLevel.Trace` and higher (all messages) * `3`: `LogLevel.Warning` and higher The driver logs HTTP request and response details, including headers. For security, the `Authorization` header is redacted in debug logs but is visible in trace logs. For advanced logging, you can use a custom `ILogger` implementation, such as Serilog or NLog. Pass the implementation to the `Configuration` class when instantiating a `Client`. ### [](#basic-example-serilog)Basic example: Serilog Install the packages: ```bash dotnet add package Serilog dotnet add package Serilog.Extensions.Logging dotnet add package Serilog.Sinks.Console dotnet add package Serilog.Sinks.File ``` Configure and use the logger: ```csharp using Fauna; using Microsoft.Extensions.Logging; using Serilog; using static Fauna.Query; Log.Logger = new LoggerConfiguration() .MinimumLevel.Verbose() .WriteTo.Console() .WriteTo.File("log.txt", rollingInterval: RollingInterval.Day, rollOnFileSizeLimit: true) .CreateLogger(); var logFactory = new LoggerFactory().AddSerilog(Log.Logger); var config = new Configuration("mysecret", logger: logFactory.CreateLogger("myapp")); var client = new Client(config); await client.QueryAsync(FQL($"1+1")); // You should see LogLevel.Debug messages in both the Console and the "log{date}.txt" file ``` # Fauna v10 JVM client driver (current) | Version: 1.0.0 | Repository: fauna/fauna-jvm | | --- | --- | Fauna’s JVM client driver lets you run FQL queries from Java and Scala applications. This guide shows how to set up the driver and use it to run FQL queries. ## [](#requirements)Requirements * Java 11 or later ## [](#supported-cloud-runtimes)Supported cloud runtimes * AWS Lambda (See [AWS Lambda connections](#aws-lambda-connections)) ## [](#installation)Installation The driver is available on the [Maven central repository](https://central.sonatype.com/artifact/com.fauna/fauna-jvm). You can add the driver to your Java project using Gradle or Maven. ### [](#gradle)Gradle File `build.gradle`: ```groovy dependencies { ... implementation "com.fauna:fauna-jvm:1.0.0" ... } ``` ### [](#maven)Maven File `fauna-java/pom.xml`: ```xml ... <dependency> <groupId>com.fauna</groupId> <artifactId>fauna-jvm</artifactId> <version>1.0.0</version> </dependency> ... ``` ## [](#api-reference)API reference API reference documentation for the driver is available at [https://fauna.github.io/fauna-jvm/](https://fauna.github.io/fauna-jvm/). ## [](#sample-app)Sample app For a practical example, check out the [Java sample app](https://github.com/fauna/java-sample-app). This sample app is an e-commerce application that uses Spring Boot and the Fauna JVM driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries. ## [](#basic-usage)Basic usage The following application: * Initializes a client instance to connect to Fauna.
* Composes a basic FQL query using an FQL string template. * Runs the query using `query()` and `asyncQuery()`. ```java package org.example; import java.util.concurrent.CompletableFuture; import java.util.concurrent.ExecutionException; import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.exception.FaunaException; import com.fauna.query.builder.Query; import com.fauna.response.QuerySuccess; import com.fauna.types.Page; import static com.fauna.codec.Generic.pageOf; import static com.fauna.query.builder.Query.fql; public class App { // Define class for `Product` documents // in expected results. public static class Product { public String name; public String description; public Integer price; } public static void main(String[] args) { try { // Initialize a default client. // It will get the secret from the $FAUNA_SECRET environment variable. FaunaClient client = Fauna.client(); // Compose a query. Query query = fql(""" Product.sortedByPriceLowToHigh() { name, description, price } """); // Run the query synchronously. System.out.println("Running synchronous query:"); runSynchronousQuery(client, query); // Run the query asynchronously. System.out.println("\nRunning asynchronous query:"); runAsynchronousQuery(client, query); } catch (FaunaException e) { System.err.println("Fauna error occurred: " + e.getMessage()); e.printStackTrace(); } catch (ExecutionException | InterruptedException e) { e.printStackTrace(); } } private static void runSynchronousQuery(FaunaClient client, Query query) throws FaunaException { // Use `query()` to run a synchronous query. // Synchronous queries block the current thread until the query completes. // Accepts the query, expected result class, and a nullable set of query options. QuerySuccess<Page<Product>> result = client.query(query, pageOf(Product.class)); printResults(result.getData()); } private static void runAsynchronousQuery(FaunaClient client, Query query) throws ExecutionException, InterruptedException { // Use `asyncQuery()` to run an asynchronous, non-blocking query. // Accepts the query, expected result class, and a nullable set of query options. CompletableFuture<QuerySuccess<Page<Product>>> futureResult = client.asyncQuery(query, pageOf(Product.class)); QuerySuccess<Page<Product>> result = futureResult.get(); printResults(result.getData()); } // Iterate through the products in the page. private static void printResults(Page<Product> page) { for (Product product : page.getData()) { System.out.println("Name: " + product.name); System.out.println("Description: " + product.description); System.out.println("Price: " + product.price); System.out.println("--------"); } // Print the `after` cursor to paginate through results. System.out.println("After: " + page.getAfter()); } } ``` ## [](#connect-to-fauna)Connect to Fauna Each Fauna query is an independently authenticated request to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). You authenticate with Fauna using an [authentication secret](../../../learn/security/authentication/#secrets). ### [](#get-an-authentication-secret)Get an authentication secret Fauna supports several [secret types](../../../learn/security/authentication/#secret-types). For testing, you can create a [key](../../../learn/security/keys/), which is a type of secret: 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/). 2. On the **Explorer** page, create a database. 3. In the database’s **Keys** tab, click **Create Key**. 4. Choose a **Role** of **server**. 5. Click **Save**. 6. Copy the **Key Secret**. The secret is scoped to the database.
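If you plan to rely on the driver’s defaults, you can store the copied key secret in the `FAUNA_SECRET` environment variable, which the client reads when no secret is configured explicitly. A minimal sketch; the value shown is a placeholder for your own secret:

```bash
# Placeholder value. Use the key secret copied from the Dashboard.
export FAUNA_SECRET=FAUNA_SECRET
```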
### [](#initialize-a-client)Initialize a client To send query requests to Fauna, initialize a `FaunaClient` instance with a Fauna authentication secret. You can pass the secret in a `FaunaConfig` object: ```java FaunaConfig config = FaunaConfig.builder().secret("FAUNA_SECRET").build(); FaunaClient client = Fauna.client(config); ``` For supported properties, see [FaunaConfig.Builder](https://fauna.github.io/fauna-jvm/latest/com/fauna/client/FaunaConfig.Builder.html#%3Cinit%3E\(\)) in the API reference. #### [](#use-an-environment-variable)Use an environment variable If not specified, `secret` defaults to the `FAUNA_SECRET` environment variable. For example: ```java // Defaults to the secret in the `FAUNA_SECRET` env var. FaunaClient client = Fauna.client(); ``` #### [](#connect-locally)Connect locally The client comes with a helper config for connecting to Fauna running locally. ```java // Connects to Fauna running locally via Docker (http://localhost:8443 and secret "secret"). FaunaClient local = Fauna.local(); ``` #### [](#scoped-client)Scoped client You can scope a client to a specific database and role. Scoped clients require a [key secret](../../../learn/security/keys/) with the built-in `admin` role. The driver uses this key to create a [scoped key](../../../learn/security/keys/#scoped-keys) internally. ```java FaunaClient db1 = Fauna.scoped(client, FaunaScope.builder("Database1").build()); FaunaScope scope2 = FaunaScope.builder("Database2").withRole(FaunaRole.named("MyRole")).build(); FaunaClient db2 = Fauna.scoped(client, scope2); ``` ### [](#multiple-connections)Multiple connections You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies. You can create multiple client instances to connect to Fauna using different secrets or client configurations. ### [](#aws-lambda-connections)AWS Lambda connections AWS Lambda freezes, thaws, and reuses execution environments for Lambda functions. See [Lambda execution environment](https://docs.aws.amazon.com/lambda/latest/dg/running-lambda-code.html). When an execution environment is thawed, Lambda only runs the function’s handler code. Objects declared outside of the handler method remain initialized from before the freeze. Lambda doesn’t re-run initialization code outside the handler. Fauna drivers keep socket connections that can time out during long freezes, causing `ECONNRESET` errors when thawed. To prevent timeouts, create Fauna client connections inside function handlers, as shown in the sketch after the next section. Fauna drivers use lightweight HTTP connections. You can create new connections for each request while maintaining good performance. ## [](#run-fql-queries)Run FQL queries Use `fql` string templates to compose FQL queries. To run the query, pass the template and an expected result class to `query()` or `asyncQuery()`: ```java Query query = fql("Product.sortedByPriceLowToHigh()"); QuerySuccess<Page<Product>> result = client.query(query, pageOf(Product.class)); ``` You can also pass a nullable set of [query options](#query-opts) to `query()` or `asyncQuery()`. These options control how the query runs in Fauna. See [Query options](#query-opts). You can only compose FQL queries using string templates.
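The following sketch illustrates the AWS Lambda guidance above: the client is created inside the handler rather than in a static initializer. The `ProductHandler` class, its input and output types, and the `aws-lambda-java-core` dependency are illustrative assumptions; the Fauna calls follow the basic usage example earlier in this guide.

```java
package org.example;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.exception.FaunaException;
import com.fauna.query.builder.Query;
import com.fauna.response.QuerySuccess;
import com.fauna.types.Page;

import static com.fauna.codec.Generic.pageOf;
import static com.fauna.query.builder.Query.fql;

// Hypothetical Lambda handler used only to illustrate where to create the client.
public class ProductHandler implements RequestHandler<String, Integer> {

    @Override
    public Integer handleRequest(String input, Context context) {
        // Create the client inside the handler so a thawed execution
        // environment doesn't reuse a stale socket connection.
        // The client reads its secret from the FAUNA_SECRET environment variable.
        FaunaClient client = Fauna.client();

        try {
            Query query = fql("Product.sortedByPriceLowToHigh() { name, description, price }");
            QuerySuccess<Page<Product>> result = client.query(query, pageOf(Product.class));
            // Return the number of products on the first page.
            return result.getData().getData().size();
        } catch (FaunaException e) {
            context.getLogger().log("Fauna error: " + e.getMessage());
            return -1;
        }
    }
}
```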
### [](#define-a-custom-class-for-your-data)Define a custom class for your data Use annotations to map a Java class to a Fauna document or object shape: ```java import com.fauna.annotation.FaunaField; import com.fauna.annotation.FaunaId; class Person { @FaunaId private String id; private String firstName; @FaunaField(name = "dob") private String dateOfBirth; } ``` You can use the `com.fauna.annotation` package to modify encoding and decoding of specific fields in classes used as arguments and results of queries: * `@FaunaId`: Should only be used once per class and be associated with a field named `id` that represents the Fauna document ID. It’s not encoded unless the `isClientGenerated` flag is `true`. * `@FaunaTs`: Should only be used once per class and be associated with a field named `ts` that represents the timestamp of a document. It’s not encoded. * `@FaunaColl`: Typically goes unmodeled. Should only be used once per class and be associated with a field named `coll` that represents the collection field of a document. It will never be encoded. * `@FaunaField`: Can be associated with any field to override its name in Fauna. * `@FaunaIgnore`: Can be used to ignore fields during encoding and decoding. Use classes in the `com.fauna.codec` package to handle type erasure when the top-level result of a query is a generic, including: * `PageOf<T>` where `T` is the element type. * `ListOf<T>` where `T` is the element type. * `MapOf<T>` where `T` is the value type. * `OptionalOf<T>` where `T` is the value type. * `NullableDocumentOf<T>` where `T` is the value type. This is specifically for cases when you return a Fauna document that may be null and want to receive a concrete `NullDocument` or `NonNullDocument` instead of catching a `NullDocumentException`. ### [](#var)Variable interpolation Use `${}` to pass native Java variables to FQL. You can escape a variable by prepending an additional `$`. ```java // Create a native Java var. var collectionName = "Product"; // Pass the var to an FQL query. Query query = fql(""" let collection = Collection(${collectionName}) collection.sortedByPriceLowToHigh() """, Map.of( "collectionName", collectionName )); ``` The driver encodes interpolated variables to an appropriate [FQL type](../../../reference/fql/types/) and uses the [wire protocol](../../../reference/http/reference/wire-protocol/) to pass the query to the Core HTTP API’s [Query endpoint](../../../reference/http/reference/core-api/#operation/query). This helps prevent injection attacks. ### [](#query-composition)Query composition You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query: ```java // Create a reusable query fragment. Query product = fql("Product.byName('pizza').first()"); // Prepare arguments for the query. Map<String, Object> queryArgs = Map.of("product", product); // Use the fragment in another FQL query. Query query = fql(""" let product = ${product} product { name, price } """, queryArgs); ``` ## [](#pagination)Pagination Use `paginate()` to asynchronously iterate through Sets that contain more than one page of results. `paginate()` accepts the same [query options](#query-opts) as `query()` and `asyncQuery()`. ```java import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.client.PageIterator; public class App { public static void main(String[] args) { FaunaClient client = Fauna.client(); // Paginate will make an async request to Fauna. PageIterator<Product> iter1 = client.paginate(fql("Product.all()"), Product.class); // Handle each page.
PageIterator extends the Java Iterator interface. while (iter1.hasNext()) { Page<Product> page = iter1.next(); List<Product> pageData = page.getData(); // Do something with your data. } PageIterator<Product> iter2 = client.paginate(fql("Product.all()"), Product.class); // You can use `flatten()` on `PageIterator` to iterate over every // element in a Set. Iterator<Product> productIter = iter2.flatten(); List<Product> products = new ArrayList<>(); // Iterate over Product elements without worrying about pages. productIter.forEachRemaining((Product p) -> products.add(p)); } } ``` ## [](#query-stats)Query stats Successful query responses and `ServiceException` exceptions include query stats: ```java package org.example; import java.util.concurrent.CompletableFuture; import java.util.concurrent.ExecutionException; import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.exception.FaunaException; import com.fauna.exception.ServiceException; import com.fauna.query.builder.Query; import static com.fauna.query.builder.Query.fql; import com.fauna.response.QueryResponse; import com.fauna.response.QuerySuccess; public class App { public static void main(String[] args) { try { FaunaClient client = Fauna.client(); Query query = fql("'Hello world'"); CompletableFuture<QuerySuccess<String>> futureResponse = client.asyncQuery(query, String.class); QueryResponse response = futureResponse.get(); System.out.println(response.getStats().toString()); } catch (FaunaException e) { if (e instanceof ServiceException) { ServiceException serviceException = (ServiceException) e; System.out.println(serviceException.getStats().toString()); } System.out.println(e); } catch (InterruptedException | ExecutionException e) { e.printStackTrace(); } } } ``` ## [](#client-configuration)Client configuration You can pass a `FaunaConfig` object to customize the configuration of a `FaunaClient` instance. ```java FaunaConfig config = new FaunaConfig.Builder() .secret("") .build(); FaunaClient client = Fauna.client(config); ``` For properties, see [FaunaConfig.Builder](https://fauna.github.io/fauna-jvm/latest/com/fauna/client/FaunaConfig.Builder.html) in the API reference. ### [](#environment-variables)Environment variables The `secret` and `endpoint` properties default to the respective `FAUNA_SECRET` and `FAUNA_ENDPOINT` environment variables. For example, if you set the following environment variables: ```bash export FAUNA_SECRET=FAUNA_SECRET export FAUNA_ENDPOINT=https://db.fauna.com/ ``` You can initialize the client with a default configuration: ```java FaunaClient client = Fauna.client(); ``` ### [](#retries)Retries The client automatically retries queries that receive a response with a 429 HTTP status code. The client will retry a query up to 4 times, including the original query request. Retries use an exponential backoff. ## [](#query-opts)Query options You can pass a `QueryOptions` object to `query()` or `asyncQuery()` to control how a query runs in Fauna. You can also use query options to instrument a query for monitoring and debugging. ```java Query query = Query.fql("'Hello World'"); QueryOptions options = QueryOptions.builder() .linearized(true) .queryTags(Map.of("tag", "value")) .timeout(Duration.ofSeconds(10)) .traceParent("00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00") .typeCheck(false) .build(); QuerySuccess<String> result = client.query(query, String.class, options); ``` For properties, see [QueryOptions.Builder](https://fauna.github.io/fauna-jvm/latest/com/fauna/query/QueryOptions.Builder.html) in the API reference.
## [](#event-feeds)Event feeds The driver supports [event feeds](../../../learn/cdc/). An event feed asynchronously polls an [event source](../../../learn/cdc/#create-an-event-source) for paginated events. To use event feeds, you must have a Pro or Enterprise plan. ### [](#request-an-event-feed)Request an event feed To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets). To get an event feed, you can use one of the following methods: * `feed()`: Synchronously fetches an event feed and returns a `FeedIterator` that you can use to iterate through the pages of events. * `asyncFeed()`: Asynchronously fetches an event feed and returns a `CompletableFuture` that you can use to iterate through the pages of events. * `poll()`: Asynchronously fetches a single page of events from the event feed and returns a `CompletableFuture` that you can use to handle each page individually. You can repeatedly call `poll()` to get successive pages. You can use `flatten()` on a `FeedIterator` to iterate through events rather than pages. ```java import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.event.FeedIterator; import com.fauna.event.EventSource; import com.fauna.event.FeedOptions; import com.fauna.event.FeedPage; import com.fauna.response.QuerySuccess; import com.fauna.event.FaunaEvent; import java.util.List; import java.util.ArrayList; import java.util.Iterator; import java.util.concurrent.CompletableFuture; import static com.fauna.query.builder.Query.fql; // Import the Product class for event data. import org.example.Product; public class EventFeedExample { private static void printEventDetails(FaunaEvent<Product> event) { System.out.println("Event Details:"); System.out.println(" Type: " + event.getType()); System.out.println(" Cursor: " + event.getCursor()); event.getTimestamp().ifPresent(ts -> System.out.println(" Timestamp: " + ts) ); event.getData().ifPresent(product -> System.out.println(" Product: " + product.toString()) ); if (event.getStats() != null) { System.out.println(" Stats: " + event.getStats()); } if (event.getError() != null) { System.out.println(" Error: " + event.getError()); } System.out.println("-------------------"); } public static void main(String[] args) { FaunaClient client = Fauna.client(); long tenMinutesAgo = System.currentTimeMillis() * 1000 - (10 * 60 * 1000 * 1000); FeedOptions options = FeedOptions.builder() .startTs(tenMinutesAgo) .pageSize(10) .build(); // Example 1: Using `feed()` FeedIterator<Product> syncIterator = client.feed( fql("Product.all().eventsOn(.price, .stock)"), options, Product.class ); System.out.println("----------------------"); System.out.println("`feed()` results:"); System.out.println("----------------------"); syncIterator.forEachRemaining(page -> { for (FaunaEvent<Product> event : page.getEvents()) { printEventDetails(event); } }); // Example 2: Using `asyncFeed()` CompletableFuture<FeedIterator<Product>> iteratorFuture = client.asyncFeed( fql("Product.all().eventsOn(.price, .stock)"), options, Product.class ); FeedIterator<Product> iterator = iteratorFuture.join(); System.out.println("----------------------"); System.out.println("`asyncFeed()` results:"); System.out.println("----------------------"); iterator.forEachRemaining(page -> { for (FaunaEvent<Product> event : page.getEvents()) { printEventDetails(event); } }); // Example 3: Using `flatten()` on a `FeedIterator` FeedIterator<Product>
flattenedIterator = client.feed( fql("Product.all().eventSource()"), options, Product.class ); Iterator<FaunaEvent<Product>> eventIterator = flattenedIterator.flatten(); List<FaunaEvent<Product>> allEvents = new ArrayList<>(); eventIterator.forEachRemaining(allEvents::add); System.out.println("----------------------"); System.out.println("`flatten()` results:"); System.out.println("----------------------"); for (FaunaEvent<Product> event : allEvents) { printEventDetails(event); } // Example 4: Using `poll()` QuerySuccess<EventSource> sourceQuery = client.query( fql("Product.all().eventSource()"), EventSource.class ); EventSource source = EventSource.fromResponse(sourceQuery.getData()); CompletableFuture<FeedPage<Product>> pageFuture = client.poll( source, options, Product.class ); while (pageFuture != null) { FeedPage<Product> page = pageFuture.join(); List<FaunaEvent<Product>> events = page.getEvents(); System.out.println("----------------------"); System.out.println("`poll()` results:"); System.out.println("----------------------"); for (FaunaEvent<Product> event : events) { printEventDetails(event); } if (page.hasNext()) { FeedOptions nextPageOptions = options.nextPage(page); pageFuture = client.poll(source, nextPageOptions, Product.class); } else { pageFuture = null; } } } } ``` If you pass an event source directly to `feed()` or `poll()` and changes occur between the creation of the event source and the event feed request, the feed replays and emits any related events. In most cases, you’ll get events after a specific start time or cursor. ### [](#get-events-after-a-specific-start-time)Get events after a specific start time When you first poll an event source using an event feed, you usually include a `startTs` (start timestamp) in the `FeedOptions` passed to `feed()`, `asyncFeed()`, or `poll()`. `startTs` is an integer representing a time in microseconds since the Unix epoch. The request returns events that occurred after the specified timestamp (exclusive). ```java Query query = fql("Product.all().eventsOn(.price, .stock)"); // Calculate the timestamp for 10 minutes ago in microseconds. long tenMinutesAgo = System.currentTimeMillis() * 1000 - (10 * 60 * 1000 * 1000); FeedOptions options = FeedOptions.builder() .startTs(tenMinutesAgo) .pageSize(10) .build(); // Example 1: Using `feed()` FeedIterator<Product> syncIterator = client.feed( query, options, Product.class ); // Example 2: Using `asyncFeed()` CompletableFuture<FeedIterator<Product>> iteratorFuture = client.asyncFeed( query, options, Product.class ); // Example 3: Using `poll()` QuerySuccess<EventSource> sourceQuery = client.query( query, EventSource.class ); EventSource source = EventSource.fromResponse(sourceQuery.getData()); CompletableFuture<FeedPage<Product>> pageFuture = client.poll( source, options, Product.class ); ``` ### [](#get-events-after-a-specific-cursor)Get events after a specific cursor After the initial request, you usually get subsequent events using the cursor for the last page or event.
To get events after a cursor (exclusive), include the cursor in the `FeedOptions` passed to `feed()`, `asyncFeed()`, or `poll()`: ```java Query query = fql("Product.all().eventsOn(.price, .stock)"); FeedOptions options = FeedOptions.builder() .cursor("gsGabc456") // Cursor for the last page .pageSize(10) .build(); // Example 1: Using `feed()` FeedIterator<Product> syncIterator = client.feed( query, options, Product.class ); // Example 2: Using `asyncFeed()` CompletableFuture<FeedIterator<Product>> iteratorFuture = client.asyncFeed( query, options, Product.class ); // Example 3: Using `poll()` QuerySuccess<EventSource> sourceQuery = client.query( query, EventSource.class ); EventSource source = EventSource.fromResponse(sourceQuery.getData()); CompletableFuture<FeedPage<Product>> pageFuture = client.poll( source, options, Product.class ); ``` ### [](#error-handling)Error handling Exceptions can be raised in two different places: * While fetching a page * While iterating a page’s events This distinction lets you ignore errors originating from event processing. For example: ```java try { FeedIterator<Product> syncIterator = client.feed( fql("Product.all().map(.details.toUpperCase()).eventSource()"), options, Product.class ); syncIterator.forEachRemaining(page -> { try { for (FaunaEvent<Product> event : page.getEvents()) { // Event-specific handling System.out.println("Event: " + event); } } catch (FaunaException e) { // Handle errors for specific events within the page System.err.println("Error processing event: " + e.getMessage()); } }); } catch (FaunaException e) { // Additional handling for initialization errors System.err.println("Error occurred with event feed initialization: " + e.getMessage()); } ``` ## [](#event-streaming)Event streams The driver supports [event streams](../../../learn/cdc/). To get an event source, append [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/) to a [supported Set](../../../learn/cdc/#sets). To start and subscribe to the stream, pass an `EventSource` and related `StreamOptions` to `stream()` or `asyncStream()`: ```java // Get an event source. Query query = fql("Product.all().eventSource() { name, stock }"); QuerySuccess<EventSource> tokenResponse = client.query(query, EventSource.class); EventSource eventSource = EventSource.fromResponse(tokenResponse.getData()); // Calculate the timestamp for 10 minutes ago in microseconds. long tenMinutesAgo = System.currentTimeMillis() * 1000 - (10 * 60 * 1000 * 1000); StreamOptions streamOptions = StreamOptions.builder().startTimestamp(tenMinutesAgo).build(); // Example 1: Using `stream()` FaunaStream<Product> stream = client.stream(eventSource, streamOptions, Product.class); // Example 2: Using `asyncStream()` CompletableFuture<FaunaStream<Product>> futureStream = client.asyncStream(eventSource, streamOptions, Product.class); ``` If changes occur between the creation of the event source and the stream request, the stream replays and emits any related events. Alternatively, you can pass an FQL query that returns an event source to `stream()` or `asyncStream()`: ```java Query query = fql("Product.all().eventSource() { name, stock }"); // Example 1: Using `stream()` FaunaStream<Product> stream = client.stream(query, Product.class); // Example 2: Using `asyncStream()` CompletableFuture<FaunaStream<Product>> futureStream = client.asyncStream(query, Product.class); ``` ### [](#create-a-subscriber-class)Create a subscriber class The methods return a `FaunaStream` publisher that lets you handle events as they arrive.
Create a class with the `Flow.Subscriber` interface to process events: ```java package org.example; import java.util.concurrent.CountDownLatch; import java.util.concurrent.Flow; import java.util.concurrent.atomic.AtomicInteger; import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.event.FaunaEvent; import com.fauna.event.FaunaStream; import com.fauna.exception.FaunaException; import static com.fauna.query.builder.Query.fql; // Import the Product class for event data. import org.example.Product; public class EventStreamExample { public static void main(String[] args) throws InterruptedException { try { FaunaClient client = Fauna.client(); // Create a stream of all products. Project the name and stock. FaunaStream<Product> stream = client.stream(fql("Product.all().eventSource() { name, stock }"), Product.class); // Create a subscriber to handle stream events. ProductSubscriber subscriber = new ProductSubscriber(); stream.subscribe(subscriber); // Wait for the subscriber to complete. subscriber.awaitCompletion(); } catch (FaunaException e) { System.err.println("Fauna error occurred: " + e.getMessage()); e.printStackTrace(); } catch (InterruptedException e) { e.printStackTrace(); } } static class ProductSubscriber implements Flow.Subscriber<FaunaEvent<Product>> { private final AtomicInteger eventCount = new AtomicInteger(0); private Flow.Subscription subscription; private final int maxEvents; private final CountDownLatch completionLatch = new CountDownLatch(1); public ProductSubscriber() { // Stream closes after 3 events. this.maxEvents = 3; } @Override public void onSubscribe(Flow.Subscription subscription) { this.subscription = subscription; subscription.request(1); } @Override public void onNext(FaunaEvent<Product> event) { // Handle each event... int count = eventCount.incrementAndGet(); System.out.println("Received event " + count + ":"); System.out.println(" Type: " + event.getType()); System.out.println(" Cursor: " + event.getCursor()); System.out.println(" Timestamp: " + event.getTimestamp()); System.out.println(" Data: " + event.getData().orElse(null)); if (count >= maxEvents) { System.out.println("Closing stream after " + maxEvents + " events"); subscription.cancel(); completionLatch.countDown(); } else { subscription.request(1); } } @Override public void onError(Throwable throwable) { System.err.println("Error in stream: " + throwable.getMessage()); completionLatch.countDown(); } @Override public void onComplete() { System.out.println("Stream completed."); completionLatch.countDown(); } public int getEventCount() { return eventCount.get(); } public void awaitCompletion() throws InterruptedException { completionLatch.await(); } } } ``` ## [](#debug-logging)Debug logging To log the driver’s HTTP requests and responses, set the `FAUNA_DEBUG` environment variable to `1`. The driver outputs requests and responses, including headers, to `stderr`. You can also use your own logger. Setting `Level.WARNING` is equivalent to `FAUNA_DEBUG=0`. Setting `Level.FINE` is equivalent to `FAUNA_DEBUG=1`. The driver logs HTTP request bodies at `Level.FINEST`.
```java import java.util.logging.ConsoleHandler; import java.util.logging.Handler; import java.util.logging.Level; import java.util.logging.SimpleFormatter; import com.fauna.client.Fauna; import com.fauna.client.FaunaClient; import com.fauna.client.FaunaConfig; class App { public static void main(String[] args) { Handler handler = new ConsoleHandler(); handler.setLevel(Level.FINEST); handler.setFormatter(new SimpleFormatter()); FaunaClient client = Fauna.client(FaunaConfig.builder().logHandler(handler).build()); } } ``` # Fauna CLI v4 | Version: 4.0.0 | Package: fauna-shell | | --- | --- | The Fauna CLI lets you access Fauna from your terminal. You can use the CLI to: * Create and manage Fauna [databases](../../../learn/data-model/databases/). * Manage [database schema](../../../learn/schema/) as `.fsl` schema files. * Run [FQL queries](../../../learn/query/) from files or in an interactive REPL. * Run a local [Fauna container](../../tools/docker/). ## [](#requirements)Requirements * [Node.js](https://nodejs.org/en/download/package-manager) v20.18 or later. * [Node.js](https://nodejs.org/en/download/package-manager) v22 or later is recommended. * A Fauna account. ## [](#quick-start)Quick start To get started: 1. **Install the CLI** ```bash npm install -g fauna-shell ``` 2. **Enable auto-complete (for bash or zsh)** Append the output of [`fauna completion`](commands/completion/) to your `.bashrc`, `.bash_profile`, `.zshrc`, or `.zprofile`. For example: ```bash fauna completion >> ~/.zshrc ``` 3. **Authenticate with Fauna** ```cli fauna login ``` 4. **Create a config file (optional)** In your project’s directory, create a `fauna.config.yaml` file. The file contains settings for different profiles. Each setting is passed as a flag when executing CLI commands. ```yaml # file: fauna.config.yaml # `dev` profile settings dev: color: true # Passed as '--color' in CLI commands database: us/my_db # Passed as '--database us/my_db' in CLI commands dir: /schema # Passed as '--dir /schema' in CLI commands role: admin # Passed as '--role admin' in CLI commands ``` 5. **Run CLI commands** Run a command using the settings from the `dev` profile in your config file: ```cli # Runs a query using the settings # from the config file's 'dev' profile. fauna query "Collection.all()" \ --profile dev ``` You can specify flags to override settings from a config file profile. For example, to run a query in a different database: ```cli # Runs a query in the 'parent_db/child_db' child database # in the 'us' region group, overriding the profile's database setting. fauna query "Collection.all()" \ --database us/parent_db/child_db \ --profile dev ``` Or run commands without using a config file by providing flags directly: ```cli # Runs a query in the top-level 'my_db' database # in the 'us' region group. Uses the default admin role. fauna query "Collection.all()" \ --database us/my_db ``` ## [](#install)Installation To install the Fauna CLI globally: ```bash npm install -g fauna-shell ``` ## [](#autocomplete)Auto-complete To enable auto-complete for CLI commands in bash or zsh, run [`fauna completion`](commands/completion/) and append the output to your `.bashrc`, `.bash_profile`, `.zshrc`, or `.zprofile`. For example: ```bash fauna completion >> ~/.zshrc ``` ## [](#auth)Authentication Most commands make requests to the [Fauna Core HTTP API](../../../reference/http/reference/core-api/), which requires authentication.
Commands support the following authentication methods: * [Interactive login](#interactive) * [Account keys](#account-key) * [Secrets](#secret) * [Local Fauna container](#local) ### [](#interactive)Interactive login For interactive use cases, use the login flow: 1. Run [`fauna login`](commands/login/) and specify an optional `--user`: ```cli # Log in as the 'john_doe' user. fauna login \ --user john_doe ``` The `--user` argument can be any string. `--user` is only used by the CLI to store and retrieve Fauna [account keys](#reference:http/reference/account-api.adoc#section/Authentication). The `--user` is not passed to Fauna itself. See [How interactive login works](#works). 2. After logging in, authenticate other CLI commands by specifying a `--database` with an optional `--user` and `--role`: ```cli # Run a query as the `john_doe` user in the 'us/my_db' database. # Use the 'server' role. fauna query "Collection.all()" \ --database us/my_db \ --user john_doe \ --role server ``` If omitted: * `--user` defaults to [`default`](#default-user) * `--role` defaults to `admin` ```cli # Run a query in the 'us/my_db' database as # the 'default' CLI user. Use the default 'admin' role. fauna query "Collection.all()" \ --database us/my_db ``` #### [](#no-redirect)Log in without redirects By default, [`fauna login`](commands/login/) runs and redirects to a local HTTP callback server at `127.0.0.1` to receive an authentication code from Fauna. This approach may not work in environments where opening a browser isn’t possible or environments with an isolated network stack, such as [Windows Subsystem for Linux (WSL)](https://learn.microsoft.com/en-us/windows/wsl/about) or other virtualized environments. In these cases, use `--no-redirect` to manually generate and provide an authentication code without redirects and a local callback server. ```cli # Log in as the 'john_doe' user # without a local callback server. fauna login \ --no-redirect \ --user john_doe ``` #### [](#works)How interactive login works Upon [login](#interactive), the CLI stores a Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) and refresh token for the `--user` in the `access_keys` file located at: * Linux, macOS, Unix: `~/.fauna/credentials/access_keys` * Windows: `%userprofile%\.fauna\credentials\access_keys` The CLI automatically refreshes keys in the `access_keys` file. ```json { "default": { "accountKey": "fnacapi_...", "refreshToken": "fnart_..." }, "john_doe": { "accountKey": "fnacapi_...", "refreshToken": "fnart_..." } } ``` The CLI uses the account key to create a short-lived [scoped key](../../../learn/security/keys/#scoped-keys) for the `--database` and `--role`. The CLI uses the scoped key to authenticate [Fauna Core HTTP API](../../../reference/http/reference/core-api/) requests for the command. The scoped key’s secret is stored under the user’s account key in the `secret_keys` file located at: * Linux, macOS, Unix: `~/.fauna/credentials/secret_keys` * Windows: `%userprofile%\.fauna\credentials\secret_keys` ```json { "fnacapi_...": { "us/my_db:admin": { "secret": "fn...", "expiresAt": 1733322000905 } } } ``` CLI-created scoped keys have a 15-minute [`ttl` (time-to-live)](../../../learn/security/keys/#ttl) and are scoped to their specific database.
#### [](#default-user)Default user The CLI automatically uses the `default` user if you don’t specify a `--user` when: * Running [`fauna login`](commands/login/) * Running other CLI commands with `--database` but no [`--account-key`](#account-key) The `default` user is associated with credentials stored under the `default` key in the `access_keys` file. For example, the following command: ```cli fauna query "Collection.all()" \ --database us/my_db ``` Is equivalent to: ```cli fauna query "Collection.all()" \ --database us/my_db \ --user default ``` ### [](#account-key)Account keys You can specify a Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) to authenticate CLI commands. The CLI uses the first account key found based on the following order of precedence: * The [`--account-key`](#account-key-flag) flag * The [`FAUNA_ACCOUNT_KEY`](#account-key-env-var) environment variable * Account keys in the `access_keys` file. These keys are typically created using an [interactive login](#interactive). For more information, see [How account key authentication works](#account-key-works). #### [](#account-key-flag)`--account-key` flag Use `--account-key` to provide an account key for a command. You must also provide a `--database` and an optional `--role`: ```cli # Run a query using an account key. fauna query "Collection.all()" \ --account-key $MY_ACCOUNT_KEY \ --database us/my_db \ --role server ``` If omitted, `--role` defaults to `admin`. You can’t use `--account-key` with `--user` or `--secret`. If both `--account-key` and `--user` are specified, `--user` is ignored. If both `--account-key` and `--secret` are specified, the command returns an error. #### [](#account-key-env-var)`FAUNA_ACCOUNT_KEY` environment variable You can specify a default account key using the `FAUNA_ACCOUNT_KEY` environment variable: ```bash export FAUNA_ACCOUNT_KEY="my_account_key" ``` #### [](#account-key-works)How account key authentication works Account key authentication works similarly to an [interactive login](#works). The CLI uses the account key to create a short-lived [scoped key](../../../learn/security/keys/#scoped-keys) for the `--database` and `--role`. The CLI uses the scoped key to authenticate [Fauna Core HTTP API](../../../reference/http/reference/core-api/) requests for the command. The scoped key’s secret is stored under the account key in the `secret_keys` file. For more information about this file, see [How interactive login works](#works). Unlike an [interactive login](#works), the user-provided account key isn’t stored in the `access_keys` file. User-provided account keys aren’t automatically refreshed. ### [](#secret)Secrets You can specify an [authentication secret](../../../learn/security/authentication/) for CLI commands. The CLI uses the first secret found based on the following order of precedence: * The [`--secret`](#secret-flag) flag * The [`FAUNA_SECRET`](#secret-env-var) environment variable * Secrets in the `secret_keys` file. These keys are typically created by the CLI using a Fauna account key. See [How interactive login works](#works) and [How account key authentication works](#account-key-works). #### [](#secret-flag)`--secret` flag Use `--secret` to directly provide a [database authentication secret](../../../learn/security/authentication/): ```cli # Run a query in the database scoped to a # secret. fauna query "Collection.all()" \ --secret $MY_SECRET ``` The command runs in the database to which the secret is scoped. 
#### [](#scoped-keys)Scoped keys If the `--secret` is a [key secret](../../../learn/security/keys/) with the `admin` role, you can use `--database` and `--role` to create and use a [scoped key](../../../learn/security/keys/#scoped-keys) that impersonates a [role](../../../learn/security/roles/) on a child database. If impersonating a user-defined role, the role must be defined in the child database. ```cli # Create and use a scoped key that impersonates a secret with the # 'server' role in the 'child_db' child database. The # query runs in the 'child_db' child database. fauna query "Collection.all()" \ --secret $MY_SECRET \ --database child_db --role server ``` Alternatively, you can manually pass a [scoped key](../../../learn/security/keys/#scoped-keys) in `--secret` and omit `--database` and `--role`: ```cli # Use a scoped key that impersonates a secret with the # 'server' role in the 'child_db' child database. The # query runs in the 'child_db' child database. fauna query "Collection.all()" \ --secret $MY_SECRET:child_db:server ``` #### [](#secret-env-var)`FAUNA_SECRET` environment variable You can specify a default secret using the `FAUNA_SECRET` environment variable: ```bash export FAUNA_SECRET="my_secret" ``` ### [](#local)Local Fauna container If you have [Docker](https://www.docker.com/) or similar software installed and running, you can use [`fauna local`](commands/local/) to start a local [Fauna container](../../tools/docker/): ```cli # Starts a local Fauna container. # The container's name defaults to 'faunadb'. fauna local ``` Use `--database` to optionally create a database in the container. ```cli # Start a local Fauna container. # Create the 'my_db' database in the container. fauna local \ --database my_db ``` When creating a database, use `--dir` to push a local directory of `.fsl` schema files to the database using [`fauna schema push`](commands/schema/push/). The schema is immediately applied to the database’s active schema with no prompts. ```cli # Create the 'my_db' database in a container. # Immediately apply changes to the 'my_db' database's # active schema. fauna local \ --database my_db \ --dir /path/to/schema/dir ``` Once started, use `--local` to run commands in the container. When specifying a `--database`, omit the region group: ```cli # Run a query in the 'my_db' database of a local # Fauna container. Use the default admin role. fauna query "Collection.all()" \ --database my_db \ --local ``` The `--local` flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443). * `--secret` to `secret`, the default secret for the [top-level key](../../tools/docker/#secret) of Fauna container instances. The secret uses the admin role. You can override these arguments using the respective `--url` and `--secret` flags. #### [](#local-scoped-keys)Scoped keys for local containers When used with `--local`, the `--database` and optional `--role` flags let you create and use a [scoped key](../../../learn/security/keys/#scoped-keys). The scoped key impersonates a [role](../../../learn/security/roles/) on a database. If impersonating a user-defined role, the role must be defined in the database. ```cli # Creates a scoped key that impersonates a secret with the # 'server' role in the 'parent_db/child_db' child database. fauna query "Collection.all()" \ --database parent_db/child_db \ --role server \ --local ``` If needed, you can explicitly provide a `--secret`.
The secret must match the [top-level key](../../tools/docker/#secret) for the Fauna container: ```cli # Equivalent to the previous command. # Provides an explicit top-level key # for the local Fauna container. fauna query "Collection.all()" \ --database parent_db/child_db \ --role server \ --local \ --secret $MY_SECRET ``` ## [](#config)Configuration The CLI lets you pass settings using a YAML (`.yml`, `.yaml`) or JSON (`.json`) config file. Each setting is passed as a flag when executing CLI commands. The config file is organized into profiles, letting you use different groups of settings for different environments or use cases. ```yaml # `dev` profile settings dev: color: true # Passed as '--color' in CLI commands database: us/my_db # Passed as '--database us/my_db' in CLI commands dir: /schema # Passed as '--dir /schema' in CLI commands # `ci` profile settings ci: local: true # Passed as '--local' in CLI commands color: false # Passed as '--color=false' in CLI commands database: my_db # Passed as '--database my_db' in CLI commands dir: /schema # Passed as '--dir /schema' in CLI commands input: false # Passed as '--input=false' in CLI commands json: true # Passed as '--json' in CLI commands ``` ### [](#settings)Settings Each setting in the config file maps directly to a command flag. For example, the `database` setting in the config file is passed as the `--database` flag when running a command. Supported flags vary by command. Not all commands support the same flags. Refer to [each command’s documentation](commands/) to see supported flags. Unsupported or unrecognized settings/flags do not trigger errors. #### [](#common-settings)Common settings The following table outlines common settings for the config file. The table is not exhaustive. | Setting | Type | Description | | --- | --- | --- | | account-key | string | Fauna account key used for authentication. If used, you must also provide a database and an optional role setting/flag.We only recommend using this setting in CI/CD and automated workflows. See Storing credentials in config files. | | color | Boolean | Enable color formatting for output. Enabled by default. Use color: false to disable. | | database | string | Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using /. Examples: us/my_db, eu/parent_db/child_db, global/db. Can’t be used with secret.If using a local Fauna container, omit the region group. | | dir | string | Path to a local directory containing .fsl files for the database. Used by fauna schema commands.Recursively scans subdirectories. Defaults to the current directory (.). | | input | Boolean | Enable user prompts. Defaults to true. To disable prompts, use input: false. | | json | Boolean | For supported commands, output results as JSON. Doesn’t affect error output. | | local | Boolean | Use a local Fauna container. If not otherwise specified, this setting configures:url to http://0.0.0.0:8443secret to secret | | quiet | Boolean | Suppress all log messages except fatal errors. Output only command results. Overrides verbosity and verbose-component settings/flags used for Debug logging. | | secret | string | Secret used for authentication. Supports scoped keys. The command runs in the database to which the secret is scoped. Can’t be used with database.We only recommend using this setting in CI/CD and automated workflows. See Storing credentials in config files.
| | role | string | Role used to run commands. Defaults to admin. Can’t be used with --secret. | | timeout | number (integer) | Maximum query runtime, in milliseconds. Used by the fauna query and fauna shell commands. | #### [](#config-creds)Storing credentials in config files While supported, we don’t recommend using the `account-key` or `secret` settings to store [account keys](#reference:http/reference/account-api.adoc#section/Authentication) or [database authentication secrets](../../../learn/security/authentication/) in interactive development environments. Instead, use [interactive login](#interactive) or the respective [FAUNA\_ACCOUNT\_KEY](#account-key-env-var) or [FAUNA\_SECRET](#secret-env-var) environment variables. For [CI/CD and automated workflows](#script-auth), you can create and store an [account key](#reference:http/reference/account-api.adoc#section/Authentication) or [database authentication secret](../../../learn/security/authentication/) with a short time-to-live (TTL) in the respective `account-key` or `secret` setting of a temporary config file. You should: * Avoid using account keys or secrets with a long time-to-live (TTL). * Ensure config files are not committed to version control. * Delete the config file after use. #### [](#setting-precedence)Setting precedence Command line flags take precedence over config file settings. If both are provided, the flag value will be used in place of the config file setting. To see what arguments are used to execute a command, use `--verbose-component argv`. See [Component-specific logging](#comp-log). ### [](#provide)Provide a config file The CLI uses the first config file found based on the following order of precedence: * The [`--config`](#config-flag) flag * The [`FAUNA_CONFIG`](#config-env-var) environment variable * A [default config file](#default), automatically detected based on filename If you provide a config file, you must also [specify a profile](#profile) to use. #### [](#config-flag)`--config` flag Use `--config` to provide a path to a config file to use. For example: ```cli # Use the `config.yml` config file. fauna query "2 + 2" \ --database us/my_db \ --config /path/to/config.yml ``` #### [](#config-env-var)`FAUNA_CONFIG` environment variable Use the `FAUNA_CONFIG` environment variable to specify a path to a config file to use. For example: ```bash export FAUNA_CONFIG="/path/to/config.yml" ``` #### [](#default)Default config files The CLI automatically detects config files with the following names in the directory where you run the command: * `fauna.config.yaml` * `fauna.config.yml` * `fauna.config.json` * `.fauna.config.yaml` * `.fauna.config.yml` * `.fauna.config.json` If multiple default config files are found, the CLI returns an error. ### [](#profile)Specify a profile Use `--profile` to specify a profile from the [config file](#config). The profile specifies the group of [settings](#settings) in the config file to use. For example: ```cli # Use settings in the config file's `container` profile. fauna query "2 + 2" \ --database us/my_db \ --profile container ``` #### [](#profile-env-var)`FAUNA_PROFILE` environment variable Use the `FAUNA_PROFILE` environment variable to specify a default profile to use. For example: ```bash export FAUNA_PROFILE="container" ``` ## [](#flag-conventions)Flag conventions In the CLI, you pass arguments to a command using flags. For example: ```cli # Uses the `--database` flag for `fauna shell`. 
fauna shell \ --database us/my_db ``` Some flags also support a shorthand alias: ```cli # Uses the `-d` alias for `--database`. fauna shell \ -d us/my_db ``` ### [](#array-arguments)Array arguments Some flags, such as `--verbose-component`, accept an array of values. To specify an array, you can use a space-separated list: ```cli # Passes `fetch` and `error` to `--verbose-component`. fauna shell \ --database us/my_db \ --verbose-component fetch error ``` Alternatively, you can use separate flags: ```cli # Passes `fetch` and `error` to `--verbose-component`. # Equivalent to the previous query. fauna shell \ --database us/my_db \ --verbose-component fetch \ --verbose-component error ``` ### [](#boolean-arguments)Boolean arguments Some flags, such as `--color`, accept a boolean value. If the flag is specified with no value, its value is `true`: ```cli # Passes a `--color` value of `true`. fauna shell \ --database us/my_db \ --color ``` Use the `--no-` prefix to pass a value of `false`: ```cli # Passes a `--color` value of `false`. fauna shell \ --database us/my_db \ --no-color ``` You can also use `=` to explicitly specify a boolean value: ```cli # Passes a `--color` value of `true`. fauna shell \ --database us/my_db \ --color=true ``` ### [](#multi-word-flags)Multi-word flags Flags that consist of multiple words support both kebab case and camel case: ```cli # Uses the `--account-key` flag. fauna query "Collection.all()" \ --account-key $MY_ACCOUNT_KEY \ --database us/my_db ``` ```cli # Uses the `--accountKey` alias for `--account-key`. fauna query "Collection.all()" \ --accountKey $MY_ACCOUNT_KEY \ --database us/my_db ``` ## [](#log)Debug logging By default, the CLI disables debug logging. You can enable debug logs using `--verbosity` and `--verbose-component`. The CLI outputs debug logs for warnings and errors to `stderr` and other messages to `stdout`. ### [](#verbosity)Global debug log level `--verbosity` accepts an integer representing a debug log level. The level determines the least critical type of message to emit. | --verbosity value | Debug log level | Output stream | | --- | --- | --- | | 5 | Debug. Detailed debug messages. | stdout | | 4 | Info. Informational messages. | stdout | | 3 | Warn. Warnings about potential issues. | stderr | | 2 | Error. Errors that need attention. | stderr | | 1 | Fatal. Critical errors. | stderr | | 0 | Default. Emit no debug logs. | | Debug logs above the `--verbosity` level are not emitted. For example: ```cli # Only emits debug logs for warnings (3), errors (2), # and fatal messages (1). fauna query "Collection.all()" \ --database us/my_db \ --verbosity 3 ``` ### [](#comp-log)Component-specific logging Use `--verbose-component` to emit messages of any level for specific components, regardless of `--verbosity`. Accepted values include: | --verbose-component value | Description | | --- | --- | | argv | Command flags, settings, and environment variables. Messages typically include information about arguments in use and the precedence of arguments. | | config | Config files. Messages typically include information about the config file in use. | | creds | Authentication. Messages typically include information about credentials in use and credential refreshes. | | error | Prints the stack trace of errors instead of the error’s message. | | fetch | HTTP requests, including requests to the Fauna Core HTTP API and Account HTTP API. Authentication secrets are redacted.
| You can pass multiple components to `--verbose-component` as a space-separated list. For example: ```cli # Emits all debug log messages for the `argv` and `config` # components, including log messages that are # less critical than warnings (3). fauna query "Collection.all()" \ --database us/my_db \ --verbosity 3 \ --verbose-component argv config \ --config /path/to/config.yml \ --profile dev ``` The command outputs the following debug logs: ```bash [config]: Reading config from /myapp/.fauna.config.yaml. [config]: Using profile dev... [config]: Applying config: { "color": true } [config]: Reading config from /myapp/.fauna.config.yaml. [config]: Using profile dev... [config]: Applying config: { "color": true } [argv]: { "_": [ "query" ], "database": "us/my_db", "d": "us/my_db", "verbosity": 3, "verbose-component": [ "argv", "config" ], "verboseComponent": [ "argv", "config" ], "config": "./.fauna.config.yaml", "profile": "dev", "p": "dev", "color": true, "json": false, "quiet": false, "user": "default", "u": "default", "local": false, "api-version": "10", "v": "10", "apiVersion": "10", "format": "fql", "f": "fql", "timeout": 5000, "performance-hints": false, "performanceHints": false, "include": [ "summary" ], "account-url": "https://account.fauna.com", "accountUrl": "https://account.fauna.com", "$0": "fauna", "fql": "Collection.all()" } [argv]: Existing Fauna environment variables: {} [argv]: Defaulted url to 'https://db.fauna.com' no --url was provided [argv]: no --input specified, using [fql] ... ``` ### [](#suppress-log)Suppress debug logs `--quiet` suppresses all debug log messages except fatal errors. `--quiet` overrides `--verbosity` and `--verbose-component`. You typically use `--quiet` to only output the results of a command. ```cli # Only output the results of the command. fauna query "Collection.all()" \ --database us/my_db \ --quiet ``` ## [](#script)Scripting Scripts and CI/CD workflows can use the CLI to automate tasks in a Fauna database. For example, you can use schema-related Fauna CLI commands to manage schema as `.fsl` files. See [Manage schema with a CI/CD pipeline](../../../learn/schema/manage-schema/#cicd). ### [](#best-practices)Best practices When using the CLI in an automated workflow, follow these best practices. #### [](#use-a-config-file-and-profile)Use a config file and profile Use a [config file](#config) and [profiles](#profile) to pass settings as flags to CLI commands. You can create profiles for different environments and use cases. You can switch between profiles using the `--profile` flag or the [`FAUNA_PROFILE`](#config-env-var) environment variable. You can override a profile’s settings by explicitly passing a flag to a command. #### [](#script-auth)Authenticate using an account key or database secret The CLI supports several [authentication methods](#auth). For CI/CD and other automated workflows, we recommend you do one of the following: * Use the [`FAUNA_ACCOUNT_KEY`](#account-key-flag) or [`FAUNA_SECRET`](#secret-env-var) environment variables. * Create a temporary config file using your script or CI tool. In the config file, create and store an [account key](#reference:http/reference/account-api.adoc#section/Authentication) or [database authentication secret](../../../learn/security/authentication/) with a short time-to-live (TTL) in the respective `account-key` or `secret` setting. Delete the config file after use. 
For example: ```yaml default: account-key: fnacapi_abc123 ``` #### [](#avoid-interactive-commands)Avoid interactive commands Avoid using commands that require user input, such as [`fauna shell`](commands/shell/) or [`fauna login`](commands/login/). #### [](#disable-interactive-prompts)Disable interactive prompts Use `--input=false` or the corresponding [setting](#settings) to disable prompts for commands such as [`fauna schema push`](commands/schema/push/) or [`fauna schema commit`](commands/schema/commit/). To set `--input=false` as a setting in a config file: ```yaml # `ci` profile settings ci: input: false ``` #### [](#use-json-output-for-parsing)Use JSON output for parsing If you use a JSON parser, such as [jq](https://jqlang.github.io/jq/), enable JSON output for compatible commands by specifying `--json=true` or the corresponding `json` [setting](#settings). To set `--json=true` as a setting in a config file: ```yaml # `ci` profile settings ci: json: true ``` #### [](#set-timeouts-for-queries)Set timeouts for queries Use `--timeout` or the corresponding [setting](#settings) to set a maximum runtime, in milliseconds, for query requests made by [`fauna query`](commands/query/). To set `--timeout` as a setting in a config file: ```yaml # `ci` profile settings ci: timeout: 3000 ``` #### [](#customize-logging-as-needed)Customize logging as needed Use `--verbosity`, `--verbose-component`, and `--quiet` to customize logging for your use case. If needed, you can redirect `stderr` and `stdout` based on your environment. See [Debug logging](#log). ## [](#migrate)Migrate from v3 of the CLI [v3](../) of the Fauna CLI is now deprecated. [v4](./) of the Fauna CLI introduces several significant enhancements to the developer experience. The following table outlines major changes from v3 to v4 of the CLI. The table is not exhaustive. | Topic | Changes in v4 | | --- | --- | | Requirements | v4 of the CLI requires Node.js v20.18 or later. v22 or later is recommended. | | Authentication | fauna login replaces fauna cloud-login. The new fauna login command uses a web-based flow. You can use --user to switch between credentials for different users after login. The .fauna-shell file is no longer used and can be safely deleted. Instead, the CLI uses short-lived credentials and refreshes credentials as needed. See How interactive login works. You can use the FAUNA_ACCOUNT_KEY or FAUNA_SECRET environment variables to authenticate commands programmatically. | | Configuration | The .fauna-shell file is no longer used and can be safely deleted. .fauna-project files are no longer used and can be safely deleted. The environment and endpoint abstractions are no longer used. Instead, you can use profiles in a config file to switch between settings for different environments or use cases. | | Local development and scripting | v4 introduces fauna local, which starts a local Fauna container. In v4, you can use --local to run CLI commands in a local Fauna container. v4 supports customized logging and scripting. | | Removed and updated commands | The following database commands have been renamed and updated: fauna create-database is now fauna database create. fauna list-databases is now fauna database list. fauna delete-database is now fauna database delete. The fauna eval command is now fauna query, which supports a fauna eval alias. v4 has removed commands for managing keys.
Instead, you can manage keys in the CLI with FQL queries using Key methods in fauna eval or fauna shell. Commands related to managing environments and endpoints have been removed. Instead, you can use profiles in a config file to switch between settings for different environments or use cases. fauna import has been removed. | | Schema management | fauna schema commands no longer require fauna project init. Otherwise, fauna schema commands remain largely unchanged. .fauna-project files are no longer used and can be safely deleted. Instead, you can use profiles in a config file to easily switch between databases and other command settings. | | Auto-complete | You can use the fauna completion command to enable auto-complete for CLI v4 commands in bash or zsh. See Auto-complete. | | CLI updates and issue tracking | Starting with v4 of the CLI, Fauna uses the fauna-shell GitHub repository for issue tracking and release tracking. v4 of the CLI uses the update-notifier library to notify you when updates to the CLI’s npm package are available. | # Fauna CLI v4 commands | Group | Command | Description | | --- | --- | --- | | Global | fauna completion | Output an auto-complete script for CLI commands in bash or zsh. | | Global | fauna local | Start a local Fauna container. | | Global | fauna login | Log in to Fauna using a web-based browser flow. | | Global | fauna query | Run a provided FQL query. | | Global | fauna shell | Run FQL queries in an interactive REPL. | | Database | fauna database create | Create a database. | | Database | fauna database delete | Delete a database. | | Database | fauna database list | List top-level or child databases. | | Export | fauna export create | Start the export of a database or specified user-defined collections to a specified destination type. | | Export | fauna export create s3 | Start an export to an AWS S3 bucket. | | Export | fauna export get | Get an export by its ID. | | Export | fauna export list | List exports. | | Schema | fauna schema abandon | Abandon a database’s staged schema. | | Schema | fauna schema commit | Apply a staged schema to a database. | | Schema | fauna schema diff | Show the diff between a database’s local, staged, or active schema. | | Schema | fauna schema pull | Pull a database’s remote .fsl schema files into a local directory. By default, pulls the database’s staged schema. | | Schema | fauna schema push | Push a local directory of .fsl schema files to a database. By default, stages the schema. | | Schema | fauna schema status | Shows a database’s staged schema status. | # `fauna completion` ```cli-sig fauna completion ``` Output an auto-complete script for CLI commands in bash or zsh. To enable auto-complete, run [`fauna completion`](./) and append the command’s output to your `.bashrc`, `.bash_profile`, `.zshrc`, or `.zprofile`. ## [](#examples)Examples ```bash fauna completion >> ~/.bashrc fauna completion >> ~/.bash_profile fauna completion >> ~/.zshrc fauna completion >> ~/.zprofile ``` # `fauna database` ```cli-sig fauna database [flags] ``` Use `fauna database` commands to create and manage databases. ## [](#available-commands)Available commands * [`fauna database create`](create/) * [`fauna database delete`](delete/) * [`fauna database list`](list/) ## [](#aliases)Aliases ```cli-sig fauna db ``` # `fauna database create` ```cli-sig fauna database create --name [flags] ``` Create a [database](../../../../../../learn/data-model/databases/). Specify the child database’s name using `--name`. Use `--database` or `--secret` to specify the parent database. If using `--secret`, the parent database is the database to which the secret is scoped. To create a top-level database, specify a region group identifier in `--database`. You can’t create a top-level database using `--secret`. ## [](#flags)Flags Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list.
Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). Can’t be used to create a [top-level database](../../../../../../learn/data-model/databases/#child). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--name ` (Required) Name of the database to create. To create a [child database](../../../../../../learn/data-model/databases/#child), specify the parent database using `--database` or `--secret`. If using `--secret`, the parent database is the database to which the secret is scoped. `--typechecked` Enable [typechecking](../../../../../../learn/query/static-typing/) for the database. Use `--no-typechecked` to disable. Defaults to enabled for top-level databases. Inherits the parent database’s setting for child databases. `--protected` Enable [protected mode](../../../../../../learn/schema/#protected-mode) for the database. `--priority ` User-defined priority for the database. Must be an integer. ## [](#examples)Examples ```cli # Create a top-level 'my_db' database # in the 'us' region group. fauna database create \ --name my_db \ --database us # Create a 'child_db' child database # directly under 'us/parent_db'. 
fauna database create \ --name child_db \ --database us/parent_db # Create a 'child_db' child database directly # under the database scoped to a secret. fauna database create \ --name child_db \ --secret my-secret # Create a database with typechecking enabled. fauna database create \ --name my_db \ --database us \ --typechecked # Create a database with protected mode enabled. fauna database create \ --name my_db \ --database us \ --protected ``` ## [](#aliases)Aliases ```cli-sig fauna db create ``` # `fauna database delete` ```cli-sig fauna database delete --name [flags] ``` Delete a [database](../../../../../../learn/data-model/databases/). Specify the child database’s name using `--name`. Use `--database` or `--secret` to specify the parent database. If using `--secret`, the parent database is the database to which the secret is scoped. To delete a top-level database, specify a region group identifier in `--database`. You can’t delete a top-level database using `--secret`. ## [](#flags)Flags Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). Can’t be used to delete a [top-level database](../../../../../../learn/data-model/databases/#child). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`.
See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. To delete a top-level database, only include the region group identifier. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--name ` (Required) Name of the database to delete. To delete a [child database](../../../../../../learn/data-model/databases/#child), specify the parent database using `--database` or `--secret`. If using `--secret`, the parent database is the database to which the secret is scoped. ## [](#examples)Examples ```cli # Delete the top-level 'my_db' database # in the 'us' region group. fauna database delete \ --name my_db \ --database us # Delete the 'child_db' database directly # under 'us/parent_db'. fauna database delete \ --name child_db \ --database us/parent_db # Delete a 'child_db' database directly under the # database scoped to a secret. fauna database delete \ --name child_db \ --secret my-secret ``` ## [](#aliases)Aliases ```cli-sig fauna db delete ``` # `fauna database list` ```cli-sig fauna database list ``` List top-level or child [databases](../../../../../../learn/data-model/databases/). To get top-level databases, omit `--database` and `--secret`. To get child databases, use `--database` or `--secret` to specify the parent database. If using `--secret`, the parent database is the database to which the secret is scoped. When using [interactive login](../../../#interactive) or an [account key](../../../#account-key), the command outputs the database’s name and path, Region Group identifier and hierarchy. Paths are not available when using a [secret](../../../#secret). ## [](#flags)Flags Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). 
`--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--page-size ` Maximum number of databases to return. Defaults to `1000`. ## [](#examples)Examples ```cli # List all top-level databases. fauna database list # List all child databases directly # under the 'us/parent_db' database. fauna database list \ --database 'us/parent_db' # Get all child databases for the database # scoped to a secret. fauna database list \ --secret my-secret # List all top-level databases and output as JSON. fauna database list \ --json # List the first 10 top-level databases. fauna database list \ --page-size 10 ``` ## [](#aliases)Aliases ```cli-sig fauna db list ``` # `fauna export` ```cli-sig fauna export [flags] ``` Use `fauna export` commands to create and manage exports for a Fauna account. An export stores document data from a database or specified user-defined collections in a specified [AWS S3 bucket](https://aws.amazon.com/s3/). `fauna export` commands do not support `--secret`, `--local`, or Fauna containers. ## [](#available-commands)Available commands * [`fauna export create`](create/) * [`fauna export get`](get/) * [`fauna export list`](list/) # `fauna export create` ```cli-sig fauna export create ``` Start the export of a database or specified user-defined collections to a specified destination type. Currently, only [AWS S3 buckets](https://aws.amazon.com/s3/) are supported as a destination type. See [`fauna export create s3`](s3/). `fauna export create` commands do not support `--secret`, `--local`, or Fauna containers.
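For instance, a minimal export request might look like the following sketch. The database path, bucket name, and path prefix here are illustrative placeholders, not values tied to your account; adjust them for your own database and S3 bucket:

```cli
# Hypothetical example: export all user-defined collections in
# the 'us/my_db' database to an illustrative S3 bucket and path prefix.
fauna export create s3 \
  --database us/my_db \
  --bucket doc-example-bucket \
  --path fauna_exports/my_db/2099-12-31
```

See [`fauna export create s3`](s3/) below for the full list of flags, including `--collection` and `--format`.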
## [](#available-commands)Available commands * [`fauna export create s3`](s3/) # `fauna export create s3` ```cli-sig fauna export create s3 [flags] ``` Start an export to an [AWS S3 bucket](https://aws.amazon.com/s3/). Export operations are asynchronous. Export requests aren’t idempotent. By default, the command outputs the export’s ID. To get additional information about the export as JSON, use `--json`. `fauna export create s3` does not support `--secret`, `--local`, or Fauna containers. ## [](#flags)Flags API `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../../login/). Defaults to [`default`](../../../../#default-user). See [Interactive login](../../../../#interactive). `-r`, `--role ` [Role](../../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. `-d`, `--database ` Database, including the region group identifier and hierarchy, to export. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. [Child databases](../../../../../../../learn/data-model/databases/#child) are not included in the export. `--collection ` User-defined collections from the database to export. Pass values as a space-separated list. Example: `--collection Product Category`. If omitted, all user-defined collections are exported. You can’t export [system collections](../../../../../../../learn/data-model/collections/#system-coll). `--bucket ` (Required) Name of the S3 bucket where the export will be stored. `--path ` (Required) Path prefix for the destination S3 bucket. Separate subfolders using a slash (`/`). A trailing slash is supported but not required. `--format ` (Required) [Data format](../../../../../../../reference/http/reference/wire-protocol/#tagged) used to convert the database’s FQL document data to JSON. Accepts [`tagged`](../../../../../../../reference/http/reference/wire-protocol/#tagged) or [`simple`](../../../../../../../reference/http/reference/wire-protocol/#simple). Defaults to `simple`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../../#config) to use. If provided, must [specify a profile](../../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../../#config). A profile is a group of [CLI settings](../../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. 
Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `-w`, `--wait`, `--watch` Wait for the export to complete or fail before exiting. Polls for the export’s state using an exponential backoff strategy. Use '--max-wait' to set a timeout. `--max-wait ` Maximum wait time in minutes. Defaults to 120 minutes. ## [](#examples)Examples ```cli # Export all user-defined collections in # the 'us/parent_db/child_db' database. Store the export # in the 'fauna_exports/parent_db/child_db/2099-12-31' # path of the 'doc-example-bucket' S3 bucket. Format FQL data # using the simple data format. fauna export create s3 \ --database us/parent_db/child_db \ --bucket doc-example-bucket \ --path fauna_exports/parent_db/child_db/2099-12-31 # Export the 'Product' and 'Category' collections in # the 'us/parent_db/child_db' database. fauna export create s3 \ --database us/parent_db/child_db \ --collection Product Category \ --bucket doc-example-bucket \ --path fauna_exports/parent_db/child_db/2099-12-31 # Encode the export's FQL document data # using the 'tagged' format. fauna export create s3 \ --database us/parent_db/child_db \ --bucket doc-example-bucket \ --path fauna_exports/parent_db/child_db/2099-12-31 \ --format tagged # Wait for the export to complete or fail before exiting. # Waits up to 180 minutes. fauna export create s3 \ --database us/parent_db/child_db \ --bucket doc-example-bucket \ --path fauna_exports/parent_db/child_db/2099-12-31 \ --wait \ --max-wait 180 ``` # `fauna export get` ```cli-sig fauna export get [flags] ``` Get an export by its ID. By default, the command outputs the export’s information as YAML. To get the information as JSON, use `--json`. `fauna export get` does not support `--secret`, `--local`, or Fauna containers. ## [](#positional-arguments)Positional arguments `` (Required) ID of the export to retrieve. ## [](#flags)Flags Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. API `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-u`, `--user ` CLI user to run the command as. 
You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `-w`, `--wait`, `--watch` Wait for the export to complete or fail before exiting. Polls for the export’s state using an exponential backoff strategy. Use '--max-wait' to set a timeout. `--max-wait ` Maximum wait time in minutes. Defaults to 120 minutes. ## [](#examples)Examples ```cli # Get an export with an ID of '123456789'. fauna export get 123456789 # Get an export as JSON. fauna export get 123456789 \ --json # Wait for the export to complete or fail before exiting. # Waits up to 180 minutes. fauna export get 123456789 \ --wait \ --max-wait 180 ``` # `fauna export list` ```cli-sig fauna export list [flags] ``` List exports. By default, the command outputs the export list as tab-separated values (TSV). To get the list as JSON, use `--json`. Exports are ordered by state and descending export ID. `fauna export list` does not support `--secret`, `--local`, or Fauna containers. ## [](#flags)Flags API `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. `--max`, `--max-results ` Maximum number of exports to return per page. Accepts `1` to `1000`. Defaults to `100`. `--state ` Filter exports by their state. Accepts one or more of the following: * `Pending`: The export request has been received but is not yet in progress. * `InProgress`: The export is in progress. This includes copying the export files to the S3 bucket. * `Complete`: The export is complete. Export files are available in the S3 bucket. * `Failed`: There was an error processing the export. Pass values as a space-separated list. Example: `--state Pending Complete`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. 
`--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. ## [](#examples)Examples ```cli # List exports in TSV format. fauna export list # List exports in JSON format. fauna export list \ --json # List up to 50 exports. fauna export list \ --max-results 50 # List exports in the 'Pending' or # 'Complete' state. fauna export list \ --state Pending Complete ``` # `fauna local` ```cli-sig fauna local [flags] ``` Start a local [Fauna container](../../../../tools/docker/). Once started, use the `--local` flag to run CLI commands in the container. See [Use a Local Fauna container with the CLI](../../#local). To use this command, you must have [Docker](https://www.docker.com/) or similar software installed and running. The command uses health checks to determine whether the container is ready. The checks ping the container at regular intervals until a successful response is received or the maximum number of attempts is reached. You can configure the checks using `--interval` and `--max-attempts`. Use `--database` to optionally create a database in the container. Use `--typechecked`, `--protected`, and `--priority` to configure the database. When creating a database, use `--dir` to push a local directory of `.fsl` schema files to the database using [`fauna schema push`](../schema/push/). The schema is immediately applied to the database’s active schema with no prompts. ## [](#flags)Flags Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../#config) to use. If provided, must [specify a profile](../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../#config). A profile is a group of [CLI settings](../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--container-port ` Port inside the container where the Fauna instance listens for requests. Defaults to `8443`. `--host-port ` Port on your host machine that maps to the `--container-port`. The CLI and other external clients can send HTTP API requests to Fauna using this port. Defaults to `8443`. `--host-ip ` IP address to bind the container’s port to. Defaults to `0.0.0.0`. `--interval ` Interval, in milliseconds, between health check attempts. How often the CLI checks if the container is ready. Defaults to `10000`. Must be greater than or equal to `0`. `--max-attempts ` Maximum number of health check attempts allowed before container startup fails. Defaults to `100`. Must be greater than `0`. `--name ` Name for the container.
Defaults to `faunadb`. `--pull` Pull the latest [`fauna/faunadb`](https://hub.docker.com/r/fauna/faunadb) image before starting the container. Defaults to true. Use `--no-pull` to disable. `--database ` Name of the database to create. Omit to create no database. `--typechecked` Enable [typechecking](../../../../../learn/query/static-typing/) for the database. Use `--no-typechecked` to disable. Defaults to enabled for databases in a Fauna container. Only valid if `--database` is set. `--protected` Enable [protected mode](../../../../../learn/schema/#protected-mode) for the database. Only valid if `--database` is set. `--priority ` User-defined priority for the database. Must be an integer. Only valid if `--database` is set. `--fsl-directory`, `--dir`, `--directory ` Path to a local directory containing `.fsl` files for the database. Recursively scans subdirectories. Defaults to the current directory (`.`). Only valid if `--database` is set. ## [](#examples)Examples ```cli # Start a local Fauna container with # default name and ports. fauna local # Start a container named 'local-fauna'. fauna local \ --name local-fauna # Map host port `1234` to container port `6789`. # Equivalent to `-p 1234:6789` in Docker. fauna local \ --host-port 1234 \ --container-port 6789 # Start a local Fauna container. # Create the 'my_db' database in the container. fauna local \ --database my_db # Start a local Fauna container. # Create the 'my_db' database in the container. # Immediately apply changes to the 'my_db' database's # active schema. fauna local \ --database my_db \ --dir /path/to/schema/dir ``` # `fauna login` ```cli-sig fauna login [flags] ``` Log in to Fauna using a web-based browser flow. This command is used to set up authentication using an [interactive login](../../#interactive). Use `--user` to specify a user to log in as. If omitted, `--user` defaults to [`default`](../../#default-user). The `--user` argument can be any string. `--user` is only used by the CLI to store and retrieve Fauna [account keys](#reference:http/reference/account-api.adoc#section/Authentication). The `--user` is not passed to Fauna itself. See [How interactive login works](../../#works). After logging in, you can authenticate other CLI commands by specifying the `--user`, a `--database`, and an optional `--role`. You can’t use this command to log in to or authenticate with a local [Fauna container](../../../../tools/docker/). See [local Fauna container authentication](../../#local). By default, [`fauna login`](./) runs and redirects to a local HTTP callback server at `127.0.0.1` to receive an authentication code from Fauna. This approach may not work in environments where opening a browser isn’t possible or environments with an isolated network stack, such as [Windows Subsystem for Linux (WSL)](https://learn.microsoft.com/en-us/windows/wsl/about) or other virtualized environments. In these cases, use `--no-redirect` to manually generate and provide an authentication code without redirects and a local callback server. ## [](#flags)Flags Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../#config) to use. If provided, must [specify a profile](../../#provide). 
`-p`, `--profile ` Profile from the CLI [config file](../../#config). A profile is a group of [CLI settings](../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `-r`, `--redirect` Log in using a local callback server (default). Use `--no-redirect` if you are unable to open a browser on your local machine. See [Log in without redirects](../../#no-redirect). `-u`, `--user ` User to log in as. Defaults to [`default`](../../#default-user). ## [](#examples)Examples ```cli # Log in as the 'default' user. fauna login # Log in as the 'john_doe' user. fauna login \ --user john_doe # Log in without redirecting to a callback server. fauna login \ --user john_doe \ --no-redirect ``` # `fauna query` ```cli-sig fauna query [] [flags] ``` Run a provided [FQL query](../../../../../learn/query/). You can provide the query as a positional `` query string. To read the query from `stdin`, use `-` as the query string. You can also use `--input` to provide the query as a file. Queries run in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, queries run in the database to which the secret is scoped. By default, FQL v10 query results are output as copy-pastable FQL. To get results in JSON, use `--format json` or `--json`. JSON results are encoded using the [simple data format](../../../../../reference/http/reference/wire-protocol/#simple). ## [](#positional-arguments)Positional arguments `` FQL query string to run. Use `-` to read from `stdin`. ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../login/). Defaults to [`default`](../../#default-user). See [Interactive login](../../#interactive). `--local` Use a local [Fauna container](../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored.
`-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. `-v`, `--api-version ` FQL version to use. Accepts one of the following: * `10` (Default): [FQL v10](../../../../../reference/fql/quick-look/) * `4`: [FQL v4](https://docs.faunadb.org/fauna/v4/api/fql/cheat_sheet/) `-f`, `--format ` Output format for query results. Accepts `fql` (Default) and `json`. Only applies to FQL v10 queries. `--json` overrides `--format`. `--typecheck` Enable query [typechecking](../../../../../learn/query/static-typing/). Only applies to FQL v10 queries. If omitted, uses the typechecking setting of the database. `--timeout ` Maximum query runtime, in milliseconds. `--performance-hints` Output [performance hints](../../../../../learn/query/performance-hints/). Sets `--include summary`. Only applies to FQL v10 queries. If no performance hints are returned, no hints are output. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../#config) to use. If provided, must [specify a profile](../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../#config). A profile is a group of [CLI settings](../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--include ` Include additional query response data in the output. Only applies to FQL v10 queries. Accepts the following values: * `all` * `none` * `txnTs` * `schemaVersion` * `summary` * `queryTags` * `stats` Defaults to only `summary`. Pass values as a space-separated list, such as `--include summary queryTags`. Each value corresponds to a field from the [query response](../../../../../reference/http/reference/core-api/#operation/query). `-i`, `--input ` Path to a file containing an FQL query to run. Can’t be used with the `` positional argument. The file can use any file extension. `.fql` is recommended. `-o`, `--output ` Path to a file where query results are written. If omitted, writes to `stdout`. ## [](#examples)Examples ```cli # Run the query in the 'us/my_db' database and write # the results to stdout. fauna query "Collection.all()" \ --database us/my_db # Run the query in the 'us/my_db' database using the # server role. fauna query "Collection.all()" \ --database us/my_db \ --role server # Run the query in the database scoped to a secret. 
fauna query "Collection.all()" \ --secret my-secret # Run the query from a file. fauna query \ --input /path/to/query.fql \ --database us/my_db # Run the query from stdin. echo "1 + 1" | fauna query - \ --database us/my_db # Run the query and write the results to a file. fauna query \ --input /path/to/query.fql \ --output /tmp/result.json \ --database us/my_db ``` ## [](#aliases)Aliases ```cli-sig fauna eval ``` # `fauna schema` ```cli-sig fauna schema [flags] ``` Use `fauna schema` commands to create and manage [database schema](../../../../../learn/schema/) as `.fsl` schema files. ## [](#available-commands)Available commands * [`fauna schema abandon`](abandon/) * [`fauna schema commit`](commit/) * [`fauna schema diff`](diff/) * [`fauna schema pull`](pull/) * [`fauna schema push`](push/) * [`fauna schema status`](status/) # `fauna schema abandon` ```cli-sig fauna schema abandon [flags] ``` Abandon a database’s [staged schema](../../../../../../learn/schema/manage-schema/#staged). You run this command as part of a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). You can abandon a staged schema at any time, including a schema with the `ready` status using [status](../status/). This is useful when you want to discard changes that are no longer needed or failed during staging. The command runs in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, the command runs in the database to which the secret is scoped. If the database has no staged schema, the command returns an error. ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. 
`-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--input` Prompt for user input. Defaults to true. To disable prompts, use `--no-input`. ## [](#examples)Examples ```cli # Abandon staged schema for the 'us/my_db' database. fauna schema abandon \ --database us/my_db # Abandon staged schema for the database scoped to a secret fauna schema abandon \ --secret my-secret # Run the command without input prompts. fauna schema abandon \ --database us/my_db \ --no-input ``` # `fauna schema commit` ```cli-sig fauna schema commit [flags] ``` Apply a [staged schema](../../../../../../learn/schema/manage-schema/#staged) to a database. You run this command as part of a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). You can only apply a staged schema that has a status of `ready`. You can check a database’s staged schema status using [`fauna schema status`](../status/). The command runs in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, the command runs in the database to which the secret is scoped. If the database has no staged schema, the command returns an error. ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. 
See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--input` Prompt for user input. Defaults to true. To disable prompts, use `--no-input`. ## [](#examples)Examples ```cli # Commit staged schema for the 'us/my_db' database. fauna schema commit \ --database us/my_db # Commit staged schema for the database scoped to a secret. fauna schema commit \ --secret my-secret # Run the command without prompts. fauna schema commit \ --database us/my_db \ --no-input ``` # `fauna schema diff` ```cli-sig fauna schema diff [flags] ``` Show the diff between a database’s local, [staged](../../../../../../learn/schema/manage-schema/#staged), or active schema. The command runs in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, the command runs in the database to which the secret is scoped. By default, the command compares the database’s staged schema to schema in a local directory, specified using `--dir`. If no schema is staged, it compares the database’s active schema to the local schema. Use `--active` to compare the database’s active schema to the local schema, regardless of whether schema is staged. Use `--staged` to compare the database’s active schema to its staged schema. ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive).
`--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--staged` Show the diff between the active and [staged schema](../../../../../../learn/schema/manage-schema/#staged). `--text` Show a text diff that contains line-by-line changes, including comments and whitespace. `--active` Show the diff between the active and local schema. `--fsl-directory`, `--dir`, `--directory ` Path to a local directory containing `.fsl` files for the database. Recursively scans subdirectories. Defaults to the current directory (`.`). ## [](#examples)Examples ```cli # Compare the database's staged schema to the local schema. 
# If no schema is staged, compare the database's active # schema to the local schema. fauna schema diff \ --database us/my_db \ --dir /path/to/schema/dir # Compare the active schema of the database scoped to a # secret to the local schema. fauna schema diff \ --secret my-secret \ --dir /path/to/schema/dir \ --active # Compare the 'us/my_db' database's active schema to the # local schema. fauna schema diff \ --database us/my_db \ --dir /path/to/schema/dir \ --active # Compare the 'us/my_db' database's active schema to its # staged schema. fauna schema diff \ --database us/my_db \ --dir /path/to/schema/dir \ --staged # Show a text diff instead of a semantic diff. fauna schema diff \ --database us/my_db \ --dir /path/to/schema/dir \ --text ``` # `fauna schema pull` ```cli-sig fauna schema pull [flags] ``` Pull a database’s remote `.fsl` schema files into a local directory. By default, pulls the database’s staged schema. If the database has no staged schema, the command pulls the database’s active schema by default. To explicitly pull the database’s active schema, use `--active`. The command runs in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, the command runs in the database to which the secret is scoped. Specify the local directory using `--dir`. To delete any schema files in the local directory that are not part of the pulled database schema, use `--delete`. ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. 
Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--delete` Delete `.fsl` files in the local directory that are not part of the pulled schema. `--active` Pull the database’s active schema files. Defaults to false. If omitted, pull the database’s [staged schema](../../../../../../learn/schema/manage-schema/#staged), if available. `--fsl-directory`, `--dir`, `--directory ` Path to a local directory containing `.fsl` files for the database. Recursively scans subdirectories. Defaults to the current directory (`.`). ## [](#examples)Examples ```cli # Pull the 'us/my_db' database's staged schema. # If the database has no staged schema, pull the # active schema. fauna schema pull \ --database us/my_db \ --dir /path/to/schema/dir # Pull the staged schema for the database scoped # to a secret. fauna schema pull \ --secret my-secret \ --dir /path/to/schema/dir # Pull the 'us/my_db' database's active schema. fauna schema pull \ --database us/my_db \ --dir /path/to/schema/dir \ --active # Delete `.fsl` files in the local directory # that are not part of the pulled schema. fauna schema pull \ --database us/my_db \ --dir /path/to/schema/dir \ --delete ``` # `fauna schema push` ```cli-sig fauna schema push [flags] ``` Push a local directory of `.fsl` schema files to a database. By default, [stages the schema](../../../../../../learn/schema/manage-schema/#staged). The command runs in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, the command runs in the database to which the secret is scoped. Specify the local directory using `--dir`. Use `--active` to immediately apply the local schema to the database’s active schema. This skips staging the schema. However, If the local schema includes [index definition](../../../../../../reference/fql-api/collection/indexes/) changes, related indexes may be temporarily unavailable due to [index builds](../../../../../../learn/data-model/indexes/#builds). ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). 
If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--input` Prompt for user input. Defaults to true. To disable prompts, use `--no-input`. `--active` Immediately apply the local schema to the database’s active schema. Skip [staging the schema](../../../../../../learn/schema/manage-schema/#staged). If the local schema includes [index definition](../../../../../../reference/fql-api/collection/indexes/) changes, related indexes may be temporarily unavailable due to [index builds](../../../../../../learn/data-model/indexes/#builds). `--fsl-directory`, `--dir`, `--directory ` Path to a local directory containing `.fsl` files for the database. Recursively scans subdirectories. 
Defaults to the current directory (`.`). ## [](#examples)Examples ```cli # Stage schema changes for the 'us/my_db' database. # If schema is already staged, replace the staged schema. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir # Stage schema changes for the database scoped to a secret. # If schema is already staged, replace the staged schema. fauna schema push \ --secret my-secret \ --dir /path/to/schema/dir # Immediately apply changes to the 'us/my_db' database's # active schema. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir \ --active # Run the command without prompts. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir \ --no-input ``` # `fauna schema status` ```cli-sig fauna schema status [flags] ``` Shows a database’s [staged schema status](../../../../../../learn/schema/manage-schema/#staged). The status indicates whether [index builds](../../../../../../learn/data-model/indexes/#builds) for the schema changes are complete. The command also outputs diffs between the database’s: * Active and staged schema * Active and local schema in the `--dir` The command runs in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, the command runs in the database to which the secret is scoped. ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../../login/). Defaults to [`default`](../../../#default-user). See [Interactive login](../../../#interactive). `--local` Use a local [Fauna container](../../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../../#account-key). Can’t be used with `--user` or `--secret`. If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON.
This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../../#config) to use. If provided, must [specify a profile](../../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../../#config). A profile is a group of [CLI settings](../../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--version` Show the Fauna CLI version. `--fsl-directory`, `--dir`, `--directory ` Path to a local directory containing `.fsl` files for the database. Recursively scans subdirectories. Defaults to the current directory (`.`). ## [](#examples)Examples ```cli # Get the staged schema status for the 'us/my_db' database. fauna schema status \ --database us/my_db # Get the staged schema status for the database # scoped to a secret. fauna schema status \ --secret my-secret ``` # `fauna shell` ```cli-sig fauna shell [flags] ``` Run [FQL queries](../../../../../learn/query/) in an interactive REPL. Queries run in the context of a single database, specified using `--database` or `--secret`. If using `--secret`, queries run in the database to which the secret is scoped. By default, FQL v10 query results are output as copy-pastable FQL. To get results in JSON, use `--format json` or `--json`. JSON results are encoded from FQL using the [simple data format](../../../../../reference/http/reference/wire-protocol/#simple). REPL session history is stored at: * Linux, macOS, Unix: `~/.fauna/history` * Windows: `%userprofile%\.fauna\history` ## [](#flags)Flags API `-u`, `--user ` CLI user to run the command as. You must first log in as the user using [`fauna login`](../login/). Defaults to [`default`](../../#default-user). See [Interactive login](../../#interactive). `--local` Use a local [Fauna container](../../../../tools/docker/). If not otherwise specified, this flag sets: * `--url` to [http://0.0.0.0:8443](http://0.0.0.0:8443) * `--secret` to `secret` `--url ` URL for [Core HTTP API requests](../../../../../reference/http/reference/core-api/) made by the command. Defaults to [https://db.fauna.com](https://db.fauna.com). `--secret ` [Secret](../../../../../learn/security/authentication/) used for authentication. Supports [scoped keys](../../../../../learn/security/keys/#scoped-keys). The command runs in the database to which the secret is scoped. If the secret is a [key secret](../../../../../learn/security/keys/) with the `admin` role, you can pass this flag with `--database` and an optional `--role` to create and use a [scoped key](../../../../../learn/security/keys/#scoped-keys) that impersonates a role on a child database. See [Scoped keys](../../#scoped-keys). `--account-key ` Fauna [account key](#reference:http/reference/account-api.adoc#section/Authentication) used for authentication. If used, you must also provide a `--database` and an optional `--role`. See [Account key authentication](../../#account-key). Can’t be used with `--user` or `--secret`. 
If `--account-key` and `--user` are specified, `--user` is ignored. `-d`, `--database ` Database, including the region group identifier and hierarchy, to run the command in. Supports shorthand region group identifiers. Separate path components using `/`. Examples: `us/my_db`, `eu/parent_db/child_db`, `global/db`. Can’t be used with `--secret`. If using a local [Fauna container](../../#local), omit the region group. `-r`, `--role ` [Role](../../../../../learn/security/roles/) used to run the command. Defaults to [`admin`](../../../../../learn/security/roles/#built-in-roles). Can’t be used with `--secret`. `-v`, `--api-version ` FQL version to use. Accepts one of the following: * `10` (Default): [FQL v10](../../../../../reference/fql/quick-look/) * `4`: [FQL v4](https://docs.faunadb.org/fauna/v4/api/fql/cheat_sheet/) `-f`, `--format ` Output format for query results. Accepts `fql` (Default) and `json`. Only applies to FQL v10 queries. `--json` overrides `--format`. `--typecheck` Enable query [typechecking](../../../../../learn/query/static-typing/). Only applies to FQL v10 queries. If omitted, uses the typechecking setting of the database. `--timeout ` Maximum query runtime, in milliseconds. `--performance-hints` Output [performance hints](../../../../../learn/query/performance-hints/). Sets `--include summary`. Only applies to FQL v10 queries. If no performance hints are returned, no hints are output. Use the [`.toggleInfo`](#togg-info) REPL command to enable or disable output of `--include` info in a session. Output `--color` Enable color formatting for output. Enabled by default. Use `--no-color` to disable. `--json` Output results as JSON. This flag doesn’t affect error output. `--quiet` Suppress all log messages except fatal errors. Output only command results. Overrides `--verbosity` and `--verbose-component`. Config `--config ` Path to a CLI [config file](../../#config) to use. If provided, must [specify a profile](../../#provide). `-p`, `--profile ` Profile from the CLI [config file](../../#config). A profile is a group of [CLI settings](../../#settings). Debug `--verbose-component ` Components to emit logs for. Overrides `--verbosity`. Accepts the following values: * `argv` * `config` * `creds` * `error` * `fetch` Pass values as a space-separated list. Example: `--verbose-component argv config`. `--verbosity ` Least critical log level to emit. Accepts integers ranging from `1` (fatal) to `5` (debug). Lower values represent more critical logs. Log messages with a level greater than this value are not logged. Options `-h`, `--help` Show help. `--include ` Include additional query response data in the output. Only applies to FQL v10 queries. Accepts the following values: * `all` * `none` * `txnTs` * `schemaVersion` * `summary` * `queryTags` * `stats` Defaults to only `summary`. Pass values as a space-separated list, such as `--include summary queryTags`. Each value corresponds to a field from the [query response](../../../../../reference/http/reference/core-api/#operation/query). Use the [`.toggleInfo`](#togg-info) REPL command to enable or disable output of `--include` info in a session. ## [](#examples)Examples ```cli # Run queries in the 'us/my_db' database. fauna shell \ --database us/my_db # Run queries in the 'us/my_db' database. # using the 'server' role. fauna shell \ --database us/my_db \ --role server # Run queries in the database scoped to a secret. 
fauna shell \ --secret my-secret ``` ## [](#repl-cmd)REPL commands REPL sessions support the following commands: `.break` Sometimes you get stuck. This gets you out. `.clear` Clear the REPL. `.clearhistory` Clear the REPL session history in `~/.fauna/history`. `.editor` Enter editor mode for multi-line queries. `.exit` Exit the REPL. `.help` Get a list of supported REPL commands. `.lastError` Display the most recent error encountered in the REPL. `.load` Load FQL queries from a file into the REPL session. `.save` Save all evaluated FQL queries from the REPL session to a file. `.toggleInfo` Enable or disable additional response data in results. If enabled, outputs fields listed in [`--include`](#include). Enabled by default. `.togglePerformanceHints` Enable or disable [performance hints](../../../../../learn/query/performance-hints/). Only applies to FQL v10 queries. Disabled by default. If no performance hint is returned, no hint is included in the output. # Fauna CLI v3 | Version: 3.0.1 | Package: fauna-shell | | --- | --- | --- | --- | The Fauna CLI lets you access Fauna from your terminal. You can use the CLI to: * Log in to your Fauna account * Create and manage Fauna databases and keys * Push, pull, and manage FSL schema * Run FQL queries in an interactive shell ## [](#requirements)Requirements * Node.js v20.x or later * A Fauna account. ## [](#installation)Installation To install the Fauna CLI globally: ```bash npm install -g fauna-shell@3.0.1 ``` ## [](#login)Log in to Fauna Use [`fauna cloud-login`](commands/cloud-login/) to log in to Fauna: ```bash fauna cloud-login ``` When prompted, enter: * **Endpoint name:** `cloud` (Press Enter) An endpoint defines the settings the CLI uses to run API requests against a Fauna account or database. See [Endpoints](#endpoints). * **Email address:** The email address for your Fauna account. * **Password:** The password for your Fauna account. * **Which endpoint would you like to set as default?** The `cloud-*` endpoint for your preferred region group. For example, to use the US region group, use `cloud-us`. [`fauna cloud-login`](commands/cloud-login/) requires an email and password login. If you log in to Fauna using GitHub or Netlify, you can enable email and password login using the [Forgot Password](https://dashboard.fauna.com/forgot-password) workflow. If successful, the command adds a related endpoint and secret to the `.fauna-shell` configuration file. See [Configuration](#config). ## [](#config)Configuration Upon [login](#login), the CLI creates or updates a `.fauna-shell` configuration file. The file is located at: * Linux, macOS, Unix: `~/.fauna-shell` * Windows: `%userprofile%\.fauna-shell` `.fauna-shell` is an [INI-format file](https://en.wikipedia.org/wiki/INI_file) that stores the configuration for Fauna [endpoints](#endpoints). Example: ```ini default=cloud-us [endpoint.cloud-us] domain=db.fauna.com scheme=https secret=fn... [endpoint.cloud-eu] domain=db.fauna.com scheme=https secret=fn... [localhost] domain=127.0.0.1 port=8443 scheme=http secret=fn... ``` An endpoint starts with `[]` followed by its properties. If an endpoint or property is duplicated, the CLI uses the last definition. ### [](#endpoints)Endpoints Internally, the CLI uses the [Fauna Core HTTP API](../../reference/http/reference/core-api/) to execute most commands. An endpoint defines the settings the CLI uses to run API requests against a Fauna account or database. Each endpoint contains: * A base `domain` for Fauna Core HTTP API endpoints. 
* An HTTP `scheme` for the base domain. * An [authentication `secret`](../../learn/security/authentication/) used to authenticate and route Fauna API requests. The secret is scoped to a specific database or a Fauna account’s top-level context. Endpoints let you switch between different Fauna accounts or databases using the CLI. #### [](#add-endpoints)Add endpoints The CLI stores endpoints in [`.fauna-shell`](#config). The [`cloud-login`](#login) command is the preferred way to add endpoints to `.fauna-shell`. Endpoints for a Fauna account or database should use: * A `domain` of `db.fauna.com` * An HTTP scheme of `https` Example: ```ini [endpoint.cloud-us] domain=db.fauna.com scheme=https secret=fn... ``` #### [](#non-standard-endpoints)Non-standard endpoints If you use a local [Fauna container](../tools/docker/), you can use [`fauna endpoint add`](commands/topics/endpoint/endpoint-add/) to add non-standard or local endpoints to `.fauna-shell`. Example: ```ini [localhost] domain=127.0.0.1 port=8443 scheme=http secret=fn... ``` ### [](#config-global-properties)Global `.fauna-shell` properties The `.fauna-shell` configuration file has the following global properties: | Property | Required | Description | | --- | --- | --- | --- | --- | | default= | | Name for the default endpoint used in Fauna CLI commands.You can override the default for a command using the --endpoint option.If no default endpoint is defined and a command doesn’t include the --endpoint option, the CLI returns an error. | ### [](#config-endpoint-properties)`.fauna-shell` endpoint properties Endpoints in the `.fauna-shell` configuration file have the following properties: | Property | Required | Description | | --- | --- | --- | --- | --- | | secret= | true | Secret used to authenticate HTTP API requests to the endpoint. | | domain= | | Hostname of the endpoint’s Fauna instance. Defaults to db.fauna.com. | | scheme= | | Connection scheme. Must be https (default) or http. | | port= | | UNIX port number of the endpoint’s Fauna instance. Defaults to 443. | | queriesFile= | | Default file containing FQL queries to run using the fauna eval command. You can override the default using the command’s --file option. | To differentiate between endpoints, you can also include arbitrary properties. Fauna ignores these properties. ## [](#basic-usage)Basic usage This section covers common Fauna CLI commands and usage. For all commands, see [Fauna CLI commands](commands/). ### [](#initialize-project)Initialize a project directory A project directory includes: ``` / // Directory containing app source code (optional) ├── .fauna-project // INI-format file containing Fauna CLI defaults for the project ├── schema/ // Directory containing Fauna .fsl schema files │ └── *.fsl ... ``` * A `.fauna-project` file that stores a default configuration for the project in Fauna CLI * `.fsl` files for the project’s database(s), typically stored in a subdirectory * (Optional) The application’s source code Use [`fauna project init`](commands/topics/project/init/) to create a `.fauna-project` file for a project directory: ```bash fauna project init ``` When prompted, provide: * A schema directory used to store `.fsl` files. If the directory doesn’t exist, the command creates it. * A default environment name. See [Environments](#environments). * A default endpoint to use for Fauna CLI commands. * A default database for Fauna CLI commands. For more information about the `.fauna-project` file, see [Project configuration](#proj-config).
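As a quick sketch of the workflow (assuming you run the commands from the project’s root directory), you can initialize the project and then inspect the generated file:

```bash
# Initialize Fauna CLI defaults for the current project directory.
# The command prompts for a schema directory, environment name,
# default endpoint, and default database.
fauna project init

# Review the generated INI-format configuration file.
cat .fauna-project
```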
### [](#create-database)Create a database Use [`fauna create-database`](commands/create-database/) to create a database: ```bash fauna create-database ``` If you’re using a [`.fauna-project`](#proj-config) file and want to create a top-level database, add `--environment=''`: ```bash fauna create-database --environment='' ``` To create a top-level database, you must use a secret scoped to the account’s top-level context. To create this secret and use it by default, use the [`fauna cloud-login`](#login) command. ### [](#manage-fsl-schema)Manage FSL schema In Fauna, you define database schema using Fauna Schema Language (FSL). You can manage FSL schemas using the [Fauna Dashboard](https://dashboard.fauna.com/) or as `.fsl` files using the Fauna CLI or the Fauna Core HTTP API’s [Schema endpoints](../../reference/http/reference/core-api/#tag/Schema). Using `.fsl` files lets you: * Store `.fsl` schema files alongside your application code * Pull and push schema to your Fauna database from a local directory * Place database schema under version control * Deploy schema with [CI/CD pipelines](../../learn/schema/manage-schema/#cicd) * Change your production schema as your app evolves using [progressive schema enforcement](../../learn/schema/#type-enforcement) and [zero-downtime migrations](../../learn/schema/#schema-migrations) For more information, see [Manage schema as `.fsl` files](../../learn/schema/manage-schema/). ### [](#create-key)Create a key Use [`fauna create-key`](commands/create-key/) to create a [key](../../learn/security/keys/) for a database: ```bash fauna create-key ``` If you’re using a [`.fauna-project`](#proj-config) file and want to create a key for a top-level database, add `--environment=''`: ```bash fauna create-key --environment='' ``` To create a key for a top-level database, you must use a secret scoped to the account’s top-level context. You can create this secret and use it by default using the [`fauna cloud-login`](#login) command. The response includes the key’s secret. The secret is shown once. You can’t recover or retrieve the secret later. If you don’t specify a role, the key uses the `admin` role by default. ### [](#run-fql-queries)Run FQL queries The Fauna CLI includes commands for running FQL queries. #### [](#run-queries-using-eval)Run queries using `eval` Use [`fauna eval`](commands/eval/) to run an FQL query from the command line, a file, or STDIN. ```bash fauna eval "Product.all()" ``` For additional examples, see the [`fauna eval`](commands/eval/) command reference docs. #### [](#run-queries-in-an-interactive-shell)Run queries in an interactive shell Use [`fauna shell`](commands/shell/) to start an interactive shell session in the Fauna CLI. You can use the session to run arbitrary FQL queries. ```bash fauna shell ``` In the shell session, you can enter editor mode to run multi-line queries: ```bash > .editor ``` ## [](#proj-config)Project configuration `.fauna-project` is an [INI-format file](https://en.wikipedia.org/wiki/INI_file) that stores a default Fauna CLI configuration for a project directory. The Fauna CLI uses these defaults when you run commands in the directory. If you run commands in a subdirectory, the CLI searches parent directories for the nearest `.fauna-project` file. 
Example: ```ini schema_directory=schema default=dev [environment.dev] endpoint=fauna-us database=accounts/dev [environment.qa] endpoint=fauna-us database=accounts/qa [environment.prod] endpoint=fauna-us database=accounts/prod ``` ### [](#environments)Environments The `.fauna-project` file lets you define multiple environments for a project. An environment groups a Fauna endpoint with a default database at the endpoint. Fauna CLI environments are typically mapped to the environments for the client application, such as `dev`, `staging`, or `prod`. You can use Fauna environments to easily switch between databases when running Fauna CLI commands. An environment starts with `[environment.]` followed by its configuration properties. If an environment or property is duplicated, the CLI uses the last definition. Several Fauna CLI commands, such as [`fauna eval`](commands/eval/), let you easily switch environments using the `--environment` option: ```bash fauna eval "Product.all()" --environment='prod' ``` ### [](#global-properties)Global properties | Property | Required | Description | | --- | --- | --- | --- | --- | | schema_directory= | | Default directory of .fsl files used for the following commands:fauna schema abandonfauna schema commitfauna schema difffauna schema pullfauna schema pushfauna schema statusYou can override the default for these commands using the --dir option.If no default endpoint is defined and the command doesn’t include the --dir option, the CLI returns an error. | | default= | | Default environment used for Fauna CLI commands. | ### [](#environment-properties)Environment properties | Property | Required | Description | | --- | --- | --- | --- | --- | | endpoint= | | Default endpoint for the environment. The endpoint must be defined in the ~/.fauna-shell configuration file. See .fauna-shell endpoint properties in the configuration documentation. | | database= | | Default database for the environment.Can include a path to a child database. Example: accounts/prod is a path to the accounts database’s prod child database. | ## [](#migrate)Migrate to v4 of the Fauna CLI [v3](./) of the Fauna CLI is now deprecated. [v4](v4/) of the Fauna CLI introduces several significant enhancements to the developer experience. The following table outlines major changes from v3 to v4 of the CLI. The table is not exhaustive. | Topic | Changes in v4 | | --- | --- | --- | --- | | Requirements | v4 of the CLI requires Node.js v20.18 or later. v22 or later is recommended. | | Authentication | fauna login replaces fauna cloud-login. The new fauna login command uses a web-based flow. You can use --user to switch between credentials for different users after login.The .fauna-shell file is no longer used and can be safely deleted. Instead, the CLI uses short-lived credentials and refreshes credentials as needed. See How interactive login works.You can use the FAUNA_ACCOUNT_KEY or FAUNA_SECRET environment variables to authenticate commands programmatically. | | Configuration | The .fauna-shell file is no longer used and can be safely deleted..fauna-project files are no longer used and can be safely deleted.The environment and endpoint abstractions are no longer used. Instead, you can use profiles in a config file to switch between settings for different environments or use cases. | | Local development and scripting | v4 introduces fauna local, which starts a local Fauna container.In v4, you can use --local to run CLI commands in a local Fauna container.v4 supports customized logging and scripting. 
| | Removed and updated commands | The following database commands have been renamed and updated:fauna create-database is now fauna database create.fauna list-databases is now fauna database list.fauna delete-database is now fauna database delete.The fauna eval command is now fauna query, which supports a fauna eval alias.v4 has removed commands for managing keys. Instead, you can manage keys in the CLI with FQL queries using Key methods in fauna eval or fauna shell.Commands related to managing environments and endpoints have been removed. Instead, you can use profiles in a config file to switch between settings for different environments or use cases.fauna import has been removed. | | Schema management | fauna schema commands no longer require fauna project init. Otherwise, fauna schema commands remain largely unchanged..fauna-project files are no longer used and can be safely deleted. Instead, you can use profiles in a config file to easily switch between databases and other command settings. | | Auto-complete | You can use the fauna completion command to enable auto-complete for CLI v4 commands in bash or zsh. See Auto-complete. | | CLI updates and issue tracking | Starting with v4 of the CLI, Fauna uses the fauna-shell GitHub repository for issue tracking and release tracking.v4 of the CLI uses the update-notifier library to notify you when updates to the CLI’s npm package is available. | # Fauna CLI commands | fauna add-endpoint | Adds an endpoint to query databases. | | --- | --- | --- | --- | | fauna cloud-login | Adds a Fauna endpoint with login credentials. | | fauna create-database | Creates a database. | | fauna create-key | Creates a key to access a database. | | fauna default-endpoint | Selects an endpoint configuration entry as the default endpoint. | | fauna delete-database | Deletes a database. | | fauna delete-endpoint | Deletes an endpoint entry from the configuration file. | | fauna delete-key | Deletes a database key. | | fauna endpoint add | Adds an endpoint to .fauna-shell. | | fauna endpoint list | Lists endpoints in .fauna-shell. | | fauna endpoint remove | Removes an endpoint from .fauna-shell. | | fauna endpoint select | Sets the default endpoint. | | fauna environment add | Adds an environment to the .fauna-project file. | | fauna environment list | Lists environments available in .fauna-project file. | | fauna environment select | Updates the default environment in .fauna-project file. | | fauna eval | Runs an FQL query. | | fauna help | Displays the Fauna CLI help. | | fauna import | Imports JSON files, CSV files, or directories into a collection. | | fauna list-databases | Lists child databases. | | fauna list-endpoints | Lists connection endpoints. | | fauna list-keys | Lists keys in the current database. | | fauna project init | Initializes a project directory by creating a .fauna-project file. | | fauna run-queries | Runs the queries from a file. | | fauna schema abandon | Abandons a staged schema change. | | fauna schema commit | Applies a staged schema change to the database. | | fauna schema diff | Prints the diff between local, staged, or active schema. | | fauna schema pull | Pulls a database’s remote .fsl schema files into a local schema directory. | | fauna schema push | Pushes a local directory of .fsl schema files to Fauna. By default, stages a schema change. | | fauna schema status | Prints the build status of a staged schema change. | | fauna shell | Starts an interactive Fauna session to run queries. 
| # `fauna add-endpoint` | Learn: Endpoints | | --- | --- | --- | Adds an [endpoint](../../#endpoints) to the [`.fauna-shell`](../../#config) configuration file. This command is deprecated. Use [`fauna endpoint add`](../topics/endpoint/endpoint-add/) instead. ## [](#syntax)Syntax fauna add-endpoint \[-y | --no-input\] \[--secret \]    \[--set-default\] \[--url \] ## [](#description)Description The `add-endpoint` command adds an [endpoint](../../#endpoints) to the [`.fauna-shell`](../../#config) configuration file. If you don’t provide a URL or secret, you’re prompted for them, and they are written to the configuration file. [`fauna cloud-login`](../cloud-login/) is the preferred way to add an endpoint. Use [`fauna endpoint add`](../topics/endpoint/endpoint-add/) to add a non-standard endpoint, such as when using the [Fauna Dev Docker image](../../../tools/docker/). ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Endpoint name. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | -y, --no-input | Disables interaction | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --set-default | Sets this environment as the default | | --url | Database URL | ## [](#aliases)Aliases ```bash fauna add-endpoint ``` ## [](#examples)Examples ```bash fauna endpoint add ``` ```bash fauna endpoint add localhost --url http://localhost:8443/ --key secret ``` ```bash fauna endpoint add localhost --set-default ``` ## [](#see-also)See also [`fauna cloud-login`](../cloud-login/) # `fauna cloud-login` Log in to Fauna. ## [](#syntax)Syntax fauna cloud-login \[-- help\] ## [](#description)Description The `cloud-login` command logs in to Fauna. If successful, the command adds a related [endpoint](../../#endpoints), including an [authentication secret](../../../../learn/security/authentication/), to the [`.fauna-shell`](../../#config) configuration file. See [Configuration](../../#config). `cloud-login` is the preferred way to add an endpoint. Use [`fauna endpoint add`](../topics/endpoint/endpoint-add/) to add a non-standard endpoint, such as when using the [Fauna Dev Docker image](../../../tools/docker/). ### [](#github-or-netlify-login)GitHub or Netlify login [`fauna cloud-login`](./) requires an email and password login. If you log in to Fauna using GitHub or Netlify, you can enable email and password login using the [Forgot Password](https://dashboard.fauna.com/forgot-password) workflow. ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --help | Help for cloud-login command. | ## [](#examples)Examples ```bash fauna cloud-login ``` ## [](#see-also)See also [`Configuration`](../../#config) [`fauna endpoint add`](../topics/endpoint/endpoint-add/) # `fauna create-database` Creates a database. ## [](#syntax)Syntax fauna create-database \[--\[no-\]color \] \[--endpoint \]   \[--environment \] \[--secret \] \[--timeout \]   \[--url \] ## [](#description)Description The `create-database` command creates a database with the provided database name. If command line options are omitted, Fauna uses the default [configuration file](../../#config) options. 
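For a quick sketch, the following invocations rely on the defaults in `.fauna-shell`. The database names and the `cloud-us` endpoint name are placeholders, not required values:

```bash
# Create a database named 'my_db' using the default endpoint
# and its secret from .fauna-shell.
fauna create-database my_db

# Create a database using a specific endpoint from .fauna-shell
# and a 10-second (10000 ms) connection timeout.
fauna create-database my_other_db --endpoint=cloud-us --timeout=10000
```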
### [](#create-a-top-level-database)Create a top-level database To create a top-level database using `create-database`, you must use an [authentication secret](../../../../learn/security/authentication/#secrets) scoped to the account’s top-level context. You can create a top-level secret using the [`fauna cloud-login`](../cloud-login/) command. ### [](#create-a-child-database)Create a child database To create a child database using `create-database`, you must use an [authentication secret](../../../../learn/security/authentication/#secrets) scoped to the parent database. ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Database name. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --help | Help for create-database command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --url | Database URL. Overrides the URL in .fauna-shell. | ## [](#examples)Examples ### [](#create-a-top-level-database-2)Create a top-level database To create a top-level database named `ECommerce`: ```bash fauna create-database ECommerce ``` If you’re using a [`.fauna-project`](../../#proj-config) file and want to create a top-level database, add `--environment=''`: ```bash fauna create-database --environment='' ``` To create a top-level database, you must use a secret scoped to the account’s top-level context. To create this secret and use it by default, use the [`fauna cloud-login`](../cloud-login/) command. ### [](#create-a-child-database-2)Create a child database To create a child database, you must use a secret scoped to the parent database. You can pass a secret using `--secret`. The following command creates a child database named `childDB`: ```bash fauna create-database --secret='fn...' childDB ``` ## [](#see-also)See also [`fauna list-databases`](../list-databases/) [`fauna delete-database`](../delete-database/) # `fauna create-key` Create a key to access a database. ## [](#syntax)Syntax fauna create-key \[\] \[--\[no-\]color \]   \[--endpoint \] \[--environment \]   \[--secret \] \[--timeout \]   \[--url \] ## [](#description)Description The `create-key` command creates a key that allows access to the `DBNAME` database. The key is assigned one of the following built-in roles: * `admin` (default) * `server` * `server-readonly` You can’t use this command to create a key in a parent or peer database. To access a database outside of the current database, log in to the [Fauna Dashboard](https://dashboard.fauna.com/). ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Name of the database to create a key for. | | | Role. Must be one of:admin (default)serverserver-readonly | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. 
Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --help | Help for create-key command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --url | Database URL. Overrides the URL in .fauna-shell. | ## [](#example)Example ### [](#create-a-key-for-a-top-level-database)Create a key for a top-level database To create a key for a top-level database, pass an empty string to the `--environment` option. The following command creates a key with the `server` role for the top-level `ECommerce` database: ```bash fauna create-key --environment='' ECommerce server ``` To create a key for a top-level database, you must use a secret scoped to the account’s top-level context. You can create this secret and use it by default using the [`fauna cloud-login`](../cloud-login/) command. ### [](#create-a-key-for-a-child-database)Create a key for a child database To create a key for a child database, you must use a secret scoped to the parent database. You can pass a secret using the `--secret` option. The following command creates a key with the `server-readonly` role for the `childDB` child database: ```bash fauna create-key --secret='fn...' childDB server-readonly ``` ## [](#see-also)See also [`fauna list-keys`](../list-keys/) [`fauna delete-key`](../delete-key/) [Configuration](../../#config) # `fauna default-endpoint` | Learn: Endpoints | | --- | --- | --- | Sets an [endpoint](../../#endpoints) from the [`.fauna-shell`](../../#config) configuration file as the default endpoint. The command updates the [`default`](../../#global-properties) property of `.fauna-shell`. This command is deprecated. Use [`fauna endpoint select`](../topics/endpoint/endpoint-select/) instead. ## [](#syntax)Syntax fauna default-endpoint ## [](#description)Description Set an endpoint as the default one. ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | New default endpoint | ## [](#aliases)Aliases ```bash fauna default-endpoint ``` ## [](#examples)Examples ```bash fauna endpoint select ``` ```bash fauna endpoint select endpoint ``` # `fauna delete-database` Deletes a database. ## [](#syntax)Syntax fauna delete-database \[--\[no-\]color \] \[--endpoint \]   \[--environment \] \[--secret \] \[--timeout \]   \[--url \] ## [](#description)Description The `delete-database` command deletes a database. ### [](#delete-a-top-level-database)Delete a top-level database To delete a top-level database using `delete-database`, you must use an [authentication secret](../../../../learn/security/authentication/#secrets) scoped to the account’s top-level context. You can create a top-level secret using the [`fauna cloud-login`](../cloud-login/) command. ### [](#delete-a-child-database)Delete a child database To delete a child database using `delete-database`, you must use an [authentication secret](../../../../learn/security/authentication/#secrets) scoped to the parent database.
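For example, here is a minimal sketch of deleting a child database with a parent-scoped secret. The `fn...` secret and the `childDB` database name are placeholders:

```bash
# Delete the 'childDB' child database using a secret scoped to
# its parent database.
fauna delete-database --secret='fn...' childDB
```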
### [](#considerations)Considerations When you delete a database, its data becomes inaccessible and is asynchronously deleted. As part of the deletion process, Fauna recursively deletes: * Any keys scoped to the database. * The database’s child databases, including any nested databases. Deleting a database with a large number of keys can exceed Transactional Write Ops throughput limits. This can cause [throttling errors](../../../../reference/http/reference/errors/#rate-limits) with a `limit_exceeded` [error code](../../../../reference/http/reference/errors/#error-codes) and a 429 HTTP status code. Deleting a database with a large number of child databases can cause timeout errors with a `time_out` [error code](../../../../reference/http/reference/errors/#error-codes) and a 440 HTTP status code. To avoid throttling or timeouts, incrementally delete all keys and child databases before deleting the database. See [delete all keys](../../../../reference/fql-api/key/delete/#delete-all-keys) and [delete all child databases](../../../../reference/fql-api/database/delete/#delete-all-dbs). ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | The name of the database to delete. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --help | Help for create-database command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --url | Database URL. Overrides the URL in .fauna-shell. | ## [](#examples)Examples Delete the `childDB` database: ```bash fauna delete-database childDB ``` ## [](#see-also)See also [`fauna create-database`](../create-database/) [`fauna list-databases`](../list-databases/) # `fauna delete-endpoint` | Learn: Endpoints | | --- | --- | --- | Removes an [endpoint](../../#endpoints) from the [`.fauna-shell`](../../#config) configuration file. This command is deprecated. Use [`fauna endpoint remove`](../topics/endpoint/endpoint-remove/) instead. ## [](#syntax)Syntax fauna delete-endpoint ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Endpoint name. | ## [](#options)Options None ## [](#examples)Examples ```bash fauna delete-endpoint my_endpoint ``` # `fauna delete-key` Delete a database key. ## [](#syntax)Syntax fauna delete-key \[--\[no-\]color \] \[--endpoint \]   \[--environment \] \[--secret \] \[--timeout \]   \[--url \] ## [](#description)Description The `delete-key` command deletes a key. ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Name of the key to delete. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. 
| | --environment | Environment to use, from a Fauna project. | | --help | Help for create-database command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --url | Database URL. Overrides the URL in .fauna-shell. | ## [](#example)Example There are already four keys for this example: ```bash fauna list-keys ``` The response includes each key’s ID, database, and role. It doesn’t include the key’s secret: ``` Key ID Database Role 259718958404338186 prod server 259719743570706945 prod client 265528117038154259 childDB admin 265437820880945683 childDB admin ``` Now, delete the first key in the list: ```bash fauna delete-key 259718958404338186 ``` List the keys again: ```bash fauna list-keys ``` The key you deleted is now gone: ``` Key ID Database Role 259719743570706945 prod client 265528117038154259 childDB admin 265437820880945683 childDB admin ``` ## [](#see-also)See also [`fauna list-keys`](../list-keys/) [`fauna create-key`](../create-key/) # `fauna endpoint add` | Learn: Endpoints | | --- | --- | --- | Adds an [endpoint](../../../../#endpoints) to the [`.fauna-shell`](../../../../#config) configuration file. ## [](#syntax)Syntax fauna endpoint add \[--help\] \[-y | --no-input\]   \[--secret \] \[--set-default\] \[--url \] ## [](#description)Description The `add-endpoint` command adds an [endpoint](../../../../#endpoints) to the [`.fauna-shell`](../../../../#config) configuration file. If you don’t provide a URL or secret, you’re prompted for them, and they are written to the configuration file. [`fauna cloud-login`](../../../cloud-login/) is the preferred way to add an endpoint. Use `endpoint add` to add a non-standard endpoint, such as when using the [Fauna Dev Docker image](../../../../../tools/docker/). ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Endpoint name.The Fauna CLI identifies the URL scheme, domain, and port and includes those values in the new endpoint entry that it creates in the configuration file. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | -y, --no-input | Disable interaction. | | --help | Help for endpoint add command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --set-default | Sets this environment as the default environment. | | --url | Database URL. | ## [](#aliases)Aliases ```bash fauna add-endpoint ``` ## [](#examples)Examples ```bash fauna endpoint add ``` ```bash fauna endpoint add localhost --url http://localhost:8443/ --key secret ``` ```bash fauna endpoint add localhost --set-default ``` ## [](#see-also)See also [`fauna cloud-login`](../../../cloud-login/) # `fauna endpoint list` | Learn: Endpoints | | --- | --- | --- | Lists [endpoints](../../../../#endpoints) from the [`.fauna-shell`](../../../../#config) configuration file. 
## [](#syntax)Syntax fauna endpoint list \[--help\] ## [](#description)Description Lists [endpoints](../../../../#endpoints) from the [`.fauna-shell`](../../../../#config) configuration file. ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --help | Help for endpoint remove command. | ## [](#aliases)Aliases ```bash fauna list-endpoints ``` ## [](#examples)Examples ```bash fauna endpoint list ``` # `fauna environment add` Add an environment to the `.fauna-project` file. ## [](#syntax)Syntax fauna environment add \[-y | --no-input\] \[--database \]   \[--endpoint \] \[--help\] \[--name \] \[--set-default\] ## [](#description)Description Add a new environment to `.fauna-project`. ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | -y, --no-input | Disable interaction. | | --name | Environment name. | | --endpoint | Endpoint to use in this environment. | | --database | Database path to use in this environment. | | --set-default | Set this environment as the default. | | --help | Help for environment add command. | ## [](#examples)Examples ```bash fauna environment add ``` ```bash fauna environment add --name my-app --endpoint dev --database my-database ``` ```bash fauna environment add --name my-app --endpoint dev --database my-database --set-default ``` # `fauna environment list` List environments available in `.fauna-project` file. ## [](#syntax)Syntax fauna environment list \[--help\] ## [](#description)Description List environments available in `.fauna-project`. ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --help | Help for environment list command. | ## [](#examples)Examples ```bash fauna environment list ``` # `fauna eval` Run an FQL query. ## [](#syntax)Syntax fauna eval \[\] \[--\[no-\]color \] \[--endpoint \]   \[--environment \] \[--file \]   \[--format json|json-tagged|shell\] \[--output \]   \[--secret \] \[--stdin\] \[--timeout \]   \[--typecheck\] \[--url \] \[--version 4|10\] ## [](#description)Description The `eval` command runs the _QUERY_ against the optional _DBNAME_ database. The query is executed in the database. If you include a _DBNAME_, it must be the first argument. The QUERY can be read from STDIN, a file, or the command line, and query results can be output to STDOUT or a file. You can also define the output format. If the query returns an error, the Fauna CLI exits with a non-zero exit code. By default, this command supports FQL v10 queries. For FQL v4 queries, use the `--version 4` option. You can’t use this command to execute a query in a parent or peer database. To access a database outside of the current database, log in to the [Fauna Dashboard](https://dashboard.fauna.com/). ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Name of the database the query should be run against. | | | Query you want to run. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --file | Name of file with queries to run. | | --format | Output format:     json     json-tagged     shell | | --help | Help for run-queries command. 
| | --output | File to write output to. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --stdin | Read file input from stdin. Write to stdout by default. | | --timeout | Connection timeout (milliseconds). | | --typecheck | Enable typechecking. | | --url | Database URL. Overrides the URL in .fauna-shell. | | --version | FQL Version:     4 = FQL version 4     10 = (default) FQL version 10See FQL v4 access. | ## [](#examples)Examples The following examples illustrate the many ways to use the `eval` command. ### [](#query-argument)_QUERY_ argument ```bash fauna eval "Product.all()" ``` Response: ``` { data: [ { id: "111", coll: Product, ts: Time("2099-07-30T22:55:21.670Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, ... ] } ``` ### [](#query-file)_QUERY_ file The query in the file is identical to the previous example: ```bash fauna eval --file=./query.fql ``` ### [](#stdin-query)STDIN query ```bash echo "Product.all()" | fauna eval --stdin ``` ### [](#database-query)Database query ```bash fauna eval childDB "Product.all()" ``` If the database doesn’t exist, a query of this type returns an error: ```bash fauna eval noChildDB "Product.all()" ``` ### [](#output-to-a-file)Output to a file ```bash fauna eval Product.all() --output=./output.json ``` Format the output: ```bash fauna eval Product.all() --format=shell --output=./output.json ``` # `fauna help` Display the Fauna CLI help. ## [](#syntax)Syntax fauna help \[\] fauna --help ## [](#description)Description The `help` command displays version, usage, and a list of supported Fauna CLI topics and commands. Include the _COMMAND_ argument to display the help for a given topic or command. ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Display help for a given topic or command. | ## [](#examples)Examples To get general help: ```bash fauna help ``` To get help for the `add-endpoint` command: ```bash fauna help add-endpoint ``` # `fauna import` Import JSON files, CSV files, or directories into a collection. ## [](#syntax)Syntax fauna import --path \[--allow-short-rows\] \[--append\]   \[--collection \] \[--\[no-\]color \]   \[--db \] \[--dry-run\] \[--endpoint \]   \[--environment \] \[--secret \]   \[--timeout \] \[--treat-empty-csv-cells-as empty|null\]   \[--type \] \[--url \] ## [](#description)Description The `import` command imports the contents of a JSON or CSV file, or a directory of JSON or CSV files into a Fauna collection. If you import a directory of source files, each file is imported into a separate collection. ## [](#import-considerations)Import considerations * JSON source files must be valid JSON. Each JSON object in the file is a document in the target collection. * CSV source files must be comma-delimited. Each line in the CSV file becomes a document in the target collection. A CSV file must have a header line with the field names for the target collection. * If your CSV file has rows with fewer fields than the number of fields in the header line, you can use the `--allow-short-rows` option to allow the import to continue. Otherwise, the import fails with an error. 
If you use the `--allow-short-rows` option, the documents imported from short rows don’t include the missing fields. * If the CSV file has empty columns, you can use the `--treat-empty-csv-cells-as` option to choose: * `null`: (default) The field doesn’t exist in the imported document. * `empty`: The field exists in the imported document as an empty string. * The target collection can be an existing Fauna collection or a new one. If the target collection exists and isn’t empty, you must use the `--append` option. The new documents are added to the existing collection. If you don’t specify a target collection, the file name of the source file is used as the name of the target collection. * The `--path` option is the required path to the source file or directory of source files. * Floating-point numbers that end with `.0` are converted by JavaScript to integers. ## [](#document-references)Document references You can’t use the `import` command to import documents that contain [document references](../../../../learn/data-model/relationships/). Instead, you can use [`fauna eval`](../eval/) to import a `.fql` file that creates the documents. For example, you can run: ```bash fauna eval --file products.fql ``` Where `products.fql` contains: ```fql // Create an Array of objects that contain document data. let products = [ { id: 123, name: 'key limes', description: 'Conventional, 16 oz bag', price: 2_99, stock: 30, category: Category.byName('produce')!.first() }, { id: 456, name: 'peaches', description: 'Organic, 2 ct', price: 3_49, stock: 50, category: Category.byName('produce')!.first() } ] // Use `forEach()` to create a `Product` collection document for each // element of the previous Array. products.forEach(doc => Product.create({ doc })) // `forEach()` returns `null`. ``` ## [](#recommended-setup)Recommended setup To ensure data import integrity, each source file record should include a unique identifying field. You can create a unique index on that field to ensure that records aren’t imported multiple times and to query against the unique field to verify import completeness. ## [](#options)Options | Option | Description | | --- | --- | | --allow-short-rows | Allow CSV files in which some rows have fewer fields than the number of fields in the header row. If the import finds a row that has more fields than the number of fields in the header row, the import fails with an error. | | --append | Append documents to an existing collection. | | --collection= | Name of the target collection. Defaults to the file name of the source file. | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --db= | Name of the database in which to create a new target collection or append to an existing target collection. The parent database is the database associated with the secret you specify for the command. | | --dry-run | Do all import operations, including record processing and type conversions, except creating documents. This allows you to detect issues that might impact an import before writing documents to your collections. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --help | Help for import command. | | --path= | Path to a source file or directory of source files. The directories can have only files, not subdirectories.
| | --secret | Authentication secret. Overrides the secret in .fauna-shell. Use a scoped key to interact with a child database using a parent database’s admin key. For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --treat-empty-csv-cells-as= | Define how empty fields in a record should be handled:       empty = Empty fields should occur in the imported document with an empty string.       null = (default) Empty fields shouldn’t occur in the imported document. | | --type=:: | Specify the data type for a CSV column. This feature is available only when importing CSV files. Type conversion is ignored for JSON and JSONL files. See the following CSV column types. | | --url | Database URL. Overrides the URL in .fauna-shell. | CSV column types: The type is one of the following data types: | Type | Action | | --- | --- | | bool | Convert true, t, yes, and 1 to true. Convert all other values to false, except null, which remains null. | | number | Convert strings to numbers. | | dateString | Convert an ISO 8601 or RFC 2822 date string into a Time value. A best effort is made for other date string formats, and warnings are emitted when such date conversions are made. | | dateEpochMillis | Convert milliseconds since Unix epoch to a Time value. | | dateEpochSeconds | Convert seconds since Unix epoch to a Time value. | ## [](#examples)Examples These are some common examples of how to use the `import` command. ### [](#import-a-json-file)Import a JSON file A file named `zipcodes.json` has the following: ```json { "zipcode" : "01001", "city" : "AGAWAM", "pop" : 15338, "state" : "MA" } { "zipcode" : "01002", "city" : "CUSHMAN", "pop" : 36963, "state" : "MA" } { "zipcode" : "01005", "city" : "BARRE", "pop" : 4546, "state" : "MA" } { "zipcode" : "01007", "city" : "BELCHERTOWN", "pop" : 10579, "state" : "MA" } { "zipcode" : "01008", "city" : "BLANDFORD", "pop" : 1240, "state" : "MA" } ``` The following terminal command imports `zipcodes.json`: ```bash fauna import --path=./zipcodes.json ``` In the preceding command, no `--collection` option is given, so the Fauna CLI creates a new collection called `zipcodes`. ### [](#import-a-json-file-and-append-to-an-existing-collection)Import a JSON file and append to an existing collection A file named `zipcodes2.json` has the following: ```json { "zipcode" : "01010", "city" : "BRIMFIELD", "pop" : 3706, "state" : "MA" } { "zipcode" : "01011", "city" : "CHESTER", "pop" : 1688, "state" : "MA" } { "zipcode" : "01012", "city" : "CHESTERFIELD", "pop" : 177, "state" : "MA" } ``` The following terminal command imports `zipcodes2.json` and appends the documents to the existing collection `zipcodes`: ```bash fauna import --path=./zipcodes2.json --collection=zipcodes --append ``` ### [](#import-a-json-file-with-configuration-options)Import a JSON file with configuration options The following terminal command overrides the URL and secret from the configuration file: ```bash fauna import --path=./zipcodes.json --url=https://db.us.fauna.com:8443 --secret=secret ``` ### [](#import-a-csv-file-with-type-conversions)Import a CSV file with type conversions The following CSV document has three string values: ```csv myDate,myBool,myNumber "May 3, 2021",true,15338 ``` To convert those string values to other types, you can use the `--type` option.
```bash fauna import --path=./myFile.csv --type=myDate::dateString --type=myBool::bool --type=myNumber::number ``` ### [](#import-a-directory-of-source-files)Import a directory of source files A directory named `source_files` has the following files: ```bash myJSONfile1.json myJSONfile2.json myCSVfile1.csv myCSVfile2.csv ``` The following command imports four files and creates four new collections: ```bash fauna import --path=./source_files ``` # `fauna list-databases` Lists databases. ## [](#syntax)Syntax fauna list-databases \[--\[no-\]color \] \[--endpoint \]   \[--environment \] \[--secret \]   \[--timeout \] \[--url \] ## [](#description)Description The `list-databases` command lists databases. ### [](#scope)Scope The `list-databases` command only lists direct child databases of the database for your authentication secret. You can’t use the `list-databases` command to access parent, peer, or other descendant databases. If you use an authentication secret scoped to an account’s top-level context, `list-databases` lists the account’s top-level databases. You can create a top-level secret using the [`fauna cloud-login`](../cloud-login/) command. ## [](#options)Options | Option | Description | | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --help | Help for list-databases command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell. Use a scoped key to interact with a child database using a parent database’s admin key. For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --url | Database URL. Overrides the URL in .fauna-shell. | ## [](#examples)Examples ```bash fauna list-databases ``` ## [](#see-also)See also [`fauna create-database`](../create-database/) [`fauna delete-database`](../delete-database/) # `fauna list-endpoints` | Learn: Endpoints | | --- | Lists [endpoints](../../#endpoints) from the [`.fauna-shell`](../../#config) configuration file. This command is deprecated. Use [`fauna endpoint list`](../topics/endpoint/endpoint-list/) instead. ## [](#syntax)Syntax fauna list-endpoints ## [](#description)Description List endpoints in [`.fauna-shell`](../../#config). ## [](#aliases)Aliases ```bash fauna endpoint list ``` ## [](#examples)Examples ```bash fauna list-endpoints ``` # `fauna list-keys` List keys in the current database. ## [](#syntax)Syntax fauna list-keys \[--\[no-\]color \] \[--endpoint \]   \[--environment \] \[--secret \]   \[--timeout \] \[--url \] ## [](#description)Description The `list-keys` command lists the keys created in the current database. ## [](#options)Options | Option | Description | | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --help | Help for list-keys command. | | --secret | Authentication secret.
Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout (milliseconds). | | --url | Database URL. Overrides the URL in .fauna-shell. | ## [](#example)Example For this example, assume that you have previously created some keys. Display the list with `fauna list-keys`: ```bash fauna list-keys ``` Response: ``` Key ID Database Role 373686120364376132 [current] admin 373711801788923969 [current] server 373714670256652356 prod server 374523090163466305 childDB admin ``` ## [](#see-also)See also [`fauna create-key`](../create-key/) [`fauna delete-key`](../delete-key/) # `fauna project init` Initialize a project directory by creating a [`.fauna-project`](../../../../#proj-config) file. ## [](#syntax)Syntax fauna project init \[\] ## [](#description)Description `fauna project init` initializes a project directory by creating a [`.fauna-project`](../../../../#proj-config) file. ```bash fauna project init ``` ### [](#prompts)Interactive prompts The `fauna project init` command requires input to interactive prompts. When running `fauna project init`, you’ll be prompted for: * Schema directory * Location to store your `.fsl` schema files * Default: Current director * Example: `schema` or `db/schema` * Default environment * Default [environment](../../../../#environments) for the project * Example: `dev`, `qa`, or `prod` * Defined in `.fauna-project` * Default endpoint * Default Fauna endpoint for the environment * Example: `fauna-us` or `fauna-eu` * Defined in [`.fauna-shell`](../../../../#config) * Default database * Default database for the environment * Example: `myapp/dev` or `myproject_development` ### [](#dir)Project directory A project directory includes: ``` / // Directory containing app source code (optional) ├── .fauna-project // INI-format file containing Fauna CLI defaults for the project ├── schema/ // Directory containing Fauna .fsl schema files │ └── *.fsl ... ``` * A `.fauna-project` file that stores a default configuration for the project in Fauna CLI * `.fsl` files for the project’s database(s), typically stored in a subdirectory * (Optional) The application’s source code For more about the `.fauna-project` file, see [Project configuration](../../../../#proj-config). ### [](#config)`.fauna-project` file `.fauna-project` is an [INI-format file](https://en.wikipedia.org/wiki/INI_file) that stores a default Fauna CLI configuration for a project directory. The Fauna CLI uses these defaults when you run commands in the directory. If you run commands in a subdirectory, the CLI searches parent directories for the nearest `.fauna-project` file. Example: ```ini schema_directory=schema default=dev [environment.dev] endpoint=fauna-us database=accounts/dev [environment.qa] endpoint=fauna-us database=accounts/qa [environment.prod] endpoint=fauna-us database=accounts/prod ``` Several Fauna CLI commands, such as [`fauna eval`](../../../eval/), let you easily switch environments using the `--environment` option: ```bash fauna eval "Product.all()" --environment='prod' ``` For more information about environments, see [environments](../../../../#environments). ### [](#set-up)Set up a project You can use `fauna project init` to set up a project directory and FSL files for an application. 
See [Set up a project using FSL and the Fauna CLI](../../../../../tutorials/project/). ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | The directory to initialize as a Fauna project. Defaults to the current directory.If the directory doesn’t exist, the command creates it. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --help | Help for project init command. | ## [](#examples)Examples ### [](#basic-example)Basic example ```bash fauna project init ``` When prompted, provide: * A schema directory used to store `.fsl` files. If the directory doesn’t exist, the command creates it. Press Enter to use the current directory. * A default environment name. See [environments](../../../../#environments). * A default endpoint to use for Fauna CLI commands. * A default database for Fauna CLI commands. ### [](#specify-a-project-directory)Specify a project directory To specify a project directory to create the [`.fauna-project`](../../../../#proj-config) file in: ```bash fauna project init path/to/some/other/dir ``` # `fauna run-queries` Run the queries from a file. ## [](#syntax)Syntax fauna run-queries \[\] \[\] \[--\[no-\]color \]   \[--file \] \[--endpoint \] \[--environment \]   \[--format json|json-tagged|shell\] \[--output \]   \[--secret \] \[--stdin\] \[--timeout \]   \[--typecheck\] \[--url \] \[--version 4|10\] ## [](#description)Description Run the queries provided in _file_. By default, this command supports FQL v10 queries. For FQL v4 queries, use the `--version 4` option. ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Database name. | | | FQL query to run. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --file | Name of file with queries to run. | | --format | Output format:     json     json-tagged     shell | | --help | Help for run-queries command. | | --output | File to write output to. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --stdin | Read file input from stdin. Write to stdout by default. | | --timeout | Connection timeout (milliseconds). | | --typecheck | Enable typechecking. | | --url | Database URL. Overrides the URL in .fauna-shell. | | --version | FQL Version:     4 = FQL version 4     10 = (default) FQL version 10See FQL v4 access. | ## [](#examples)Examples ```bash fauna run-queries dbname --file=/path/to/queries.fql ``` # `fauna schema abandon` Abandons a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). ## [](#syntax)Syntax fauna schema abandon \[-y | --no-input\] \[--\[no-\]color \]   \[--dir \] \[--endpoint \] \[--environment \]   \[--help\] \[--secret \] \[--timeout \] \[--url \] ## [](#description)Description `fauna schema abandon` abandons a [staged schema change](#staged). You can abandon a staged schema change at any time, including a change with the `ready` status. 
This is useful when you want to discard changes that are no longer needed or failed during staging. ### [](#no-staged-schema-changes)No staged schema changes If a database has no staged schema, the command returns an error: ```bash › Error: There is no staged schema to abandon ``` To stage schema, use the [`fauna schema push`](../push/) command. See [Run a staged schema change](#staged). ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | -y, --no-input | Abandon the staged schema change without confirmation input. | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --dir | A local directory of .fsl files. Defaults to the directory specified in .fauna-project. | | --endpoint | Fauna server endpoint. | | --environment | Environment to use from .fauna-project. | | --help | Help for the command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout in milliseconds. | | --url | Database URL. Overrides the url in .fauna-shell. | ## [](#examples)Examples ### [](#basic-example)Basic example To abandon a staged schema change, regardless of its status: ```bash fauna schema abandon ``` By default, the command returns a semantic diff containing the staged schema changes and requires a confirmation to abandon the changes: ```bash Connected to endpoint: cloud-us * Modifying collection `Customer` at collections.fsl:3:1: * Indexes: + add index `byEmail` ? Abandon these changes? (y/N) ``` ### [](#abandon-a-staged-schema-change-without-input)Abandon a staged schema change without input Use the `--no-input` option or its `-y` alias to abandon a staged schema change without prompting for confirmation or displaying a diff. This is useful for using the command programmatically, such as in a CI/CD workflow. ```bash fauna schema abandon --no-input ``` Or: ```bash fauna schema abandon -y ``` ### [](#staged)Run a staged schema change You use the `schema abandon` command as part of a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). A staged schema change lets you change one or more [collection schema](../../../../../../reference/fsl/collection/) without index downtime due to [index builds](../../../../../../learn/data-model/indexes/#builds). To run a staged schema change, you must use the [Fauna CLI](../../../../) or the Fauna Core HTTP API’s [Schema endpoints](../../../../../../reference/http/reference/core-api/#tag/Schema). You can’t run a staged schema change using [FQL schema methods](../../../../../../learn/schema/manage-schema/#fql) or the [Fauna Dashboard](https://dashboard.fauna.com/). To run a staged schema change using the Fauna CLI: 1. Make the desired changes to `.fsl` files in your schema directory. 2. Use [`fauna schema push`](../../../../v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../../../v4/commands/schema/push/) stages schema changes by default: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. 
fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../../../v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../../../../../../learn/data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../../../../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](../../../../../../learn/schema/manage-schema/#unstaged). You must first [abandon the staged schema change](#abandon). 3. Use [`fauna schema status`](../../../../v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 4. When the status is `ready`, use [`fauna schema commit`](../../../../v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../../../v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` ## [](#see-also)See also [`fauna schema push`](../push/) [`fauna schema status`](../status/) [`fauna schema commit`](../commit/) # `fauna schema commit` Applies a [staged schema change](../../../../../../learn/schema/manage-schema/#staged) to a database. ## [](#syntax)Syntax fauna schema commit \[-y | --no-input\] \[--\[no-\]color \] \[--dir \]   \[--endpoint \] \[--environment \] \[--help\]   \[--secret \] \[--timeout \] \[--url \] ## [](#description)Description `fauna schema commit` command applies a [staged schema change](#staged) to a database. ### [](#staged-schema-status)Staged schema status You can only apply a staged schema that has a status of `ready`. You can check a staged schema’s status using the [`fauna schema status`](../status/) command. If you run `fauna schema commit` and the staged schema is not `ready`, the command returns an error: ```bash › Error: Schema is not ready to be committed ``` ### [](#no-staged-schema-change)No staged schema change If the database has no staged schema, the command returns an error: ```bash › Error: There is no staged schema to commit ``` To stage a schema change, use the [`fauna schema push`](../push/) command. See [Run a staged schema change](#staged). ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | -y, --no-input | Commit the change without confirmation input. | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. 
| | --dir | A local directory of .fsl files. Defaults to the directory specified in .fauna-project. | | --endpoint | Fauna server endpoint. | | --environment | Environment to use from .fauna-project. | | --help | Help for the command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout in milliseconds. | | --url | Database URL. Overrides the url in .fauna-shell. | ## [](#examples)Examples ### [](#basic-example)Basic example To commit a staged schema change with a `ready` status: ```bash fauna schema commit ``` By default, the command returns a semantic diff of the staged schema changes and requires a confirmation to commit the changes: ```bash Connected to endpoint: cloud-us * Modifying collection `Customer` at collections.fsl:3:1: * Indexes: + add index `byName` ? Accept and commit these changes? (y/N) ``` ### [](#commit-a-staged-schema-change-without-input)Commit a staged schema change without input Use the `--no-input` option or its `-y` alias to commit a staged schema change without prompting for confirmation or displaying a diff. This is useful for using the command programmatically, such as in a CI/CD workflow. ```bash fauna schema commit --no-input ``` Or: ```bash fauna schema commit -y ``` ### [](#staged)Run a staged schema change You use the `schema commit` command as part of a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). A staged schema change lets you change one or more [collection schema](../../../../../../reference/fsl/collection/) without index downtime due to [index builds](../../../../../../learn/data-model/indexes/#builds). To run a staged schema change, you must use the [Fauna CLI](../../../../) or the Fauna Core HTTP API’s [Schema endpoints](../../../../../../reference/http/reference/core-api/#tag/Schema). You can’t run a staged schema change using [FQL schema methods](../../../../../../learn/schema/manage-schema/#fql) or the [Fauna Dashboard](https://dashboard.fauna.com/). To run a staged schema change using the Fauna CLI: 1. Make the desired changes to `.fsl` files in your schema directory. 2. Use [`fauna schema push`](../../../../v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../../../v4/commands/schema/push/) stages schema changes by default: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../../../v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../../../../../../learn/data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../../../../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. 
If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](../../../../../../learn/schema/manage-schema/#unstaged). You must first [abandon the staged schema change](#abandon). 3. Use [`fauna schema status`](../../../../v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 4. When the status is `ready`, use [`fauna schema commit`](../../../../v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../../../v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` ## [](#see-also)See also [`fauna schema push`](../push/) [`fauna schema status`](../status/) [`fauna schema abandon`](../abandon/) # `fauna schema diff` Prints the diff between local, [staged](../../../../../../learn/schema/manage-schema/#staged), or active schema. ## [](#syntax)Syntax fauna schema diff \[--active\] \[--\[no-\]color \] \[--dir \]   \[--endpoint \] \[--environment \] \[--help\]   \[--secret \] \[--staged\] \[--text\] \[--timeout \]   \[--url \] ## [](#description)Description `fauna schema diff` prints the diff between local and remote schema. See [Examples](#examples). ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --active | Show the diff between the local and remote active schema. | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --dir | A local directory of .fsl files to compare. Defaults to the directory specified in .fauna-project. | | --endpoint | Fauna server endpoint. | | --environment | Environment to use from .fauna-project. | | --help | Help for schema diff command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --staged | Show the diff between the remote active and remote staged schema. | | --text | Display the text diff instead of the semantic diff. | | --timeout | Connection timeout in milliseconds. | | --url | Database URL. Overrides the url in .fauna-shell. 
| ## [](#examples)Examples ### [](#compare-local-to-staged-schema)Compare local to staged schema ```bash fauna schema diff ``` This compares the project’s local schema to the database’s remote [staged schema](../../../../../../learn/schema/manage-schema/#staged): ``` Connected to endpoint: cloud-us Differences from the remote, staged schema to the local schema: * Modifying collection `Customer` at collections.fsl:1:1: * Indexes: + add index `byEmail` ``` If the database has no staged schema, the command compares the database’s local schema to the remote active schema instead: ``` Connected to endpoint: cloud-us Differences from the remote schema to the local schema: * Modifying collection `Customer` at collections.fsl:1:1 (previously defined at main.fsl:4:1): * Indexes: + add index `sortedByName` ``` ### [](#compare-local-to-active-schema)Compare local to active schema ```bash fauna schema diff --active ``` ``` Connected to endpoint: cloud-us Differences from the remote, active schema to the local schema: * Modifying collection `Customer` at collections.fsl:4:1: * Indexes: + add index `byEmail` ``` ### [](#compare-active-to-staged-schema)Compare active to staged schema ```bash fauna schema diff --staged ``` ``` Connected to endpoint: cloud-us Differences from the remote, active schema to the remote, staged schema: * Modifying collection `Customer` at collections.fsl:4:1: * Indexes: + add index `byEmail` ``` ### [](#display-text-diff-instead-of-semantic-diff)Display text diff instead of semantic diff Use `--text` to show a textual difference instead of a semantic one: ```bash fauna schema diff --text ``` ``` Connected to endpoint: cloud-us Differences from the remote, staged schema to the local schema: collections.fsl @ line 1 to 7 collection Customer { email: String + + index byEmail { + terms [.email] + } } ``` ### [](#specify-a-schema-directory)Specify a schema directory Use `--dir` to specify a schema directory containing local `.fsl` schema files: ```bash fauna schema diff --dir schemas/myschema ``` ## [](#see-also)See also [`fauna schema push`](../push/) # `fauna schema pull` Pull a database’s remote `.fsl` schema files into a local schema directory. ## [](#syntax)Syntax fauna schema pull \[--active\] \[--\[no-\]color \] \[--delete\]   \[--dir \] \[--endpoint \] \[--environment \]   \[--help\] \[--secret \] \[--timeout \]   \[--url \] ## [](#description)Description `fauna schema pull` pulls a database’s remote `.fsl` schema files into a local schema directory. If the database has [staged schema](../../../../../../learn/schema/manage-schema/#staged), the command pulls the database’s remote staged schema by default. If the database has no staged schema, the command pulls the database’s remote schema. To pull the remote active schema regardless of whether schema is staged, use the `--active` option. See [Pull active schema files](#active). You can check whether a database has staged schema using [`fauna schema status`](../status/). ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --active | Pulls remote active schema files for the database into the local schema directory. | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --delete | Delete .fsl files in the local directory that aren’t part of the database schema. | | --dir | A local directory to pull .fsl files into. 
Defaults to the directory specified in .fauna-project. | | --endpoint | Fauna server endpoint. | | --environment | Environment to use from .fauna-project. | | --help | Help for schema pull command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout in milliseconds. | | --url | Database URL. Overrides the url in .fauna-shell. | ## [](#examples)Examples ### [](#basic-example)Basic example ```bash fauna schema pull ``` If the database has [staged schema](../../../../../../learn/schema/manage-schema/#staged), the command pulls the database’s remote staged schema. If the database has no staged schema, the command pulls the database’s remote schema. ### [](#active)Pull active schema files Use the `--active` option to pull the database’s remote active schema regardless of whether the database has [staged schema](../../../../../../learn/schema/manage-schema/#staged): ```bash fauna schema pull --active ``` #### [](#delete-local)Delete local files The `schema pull` command overwrites existing schema files in the local directory. If wanted, you can use the `--delete` option to delete local `.fsl` files that aren’t part of the remote schema: ```bash fauna schema pull --delete ``` ## [](#see-also)See also [`fauna schema diff`](../diff/) [`fauna schema push`](../push/) # `fauna schema push` Push a local directory of `.fsl` schema files to Fauna. By default, [stages a schema change](#staged). ## [](#syntax)Syntax fauna schema push \[-y | --no-input\] \[--active\] \[--\[no-\]color \]   \[--dir \] \[--endpoint \] \[--environment \]   \[--help\] \[--secret \] \[--timeout \] \[--url \] ## [](#description)Description `fauna schema push` pushes a local directory of `.fsl` files to Fauna. Valid FSL filenames must use the `.fsl` extension and can’t start with `*`. ### [](#staged-schema-changes)Staged schema changes To avoid index downtime due to [index builds](../../../../../../learn/data-model/indexes/#builds), `schema push` stages schema changes by default. See [Run a staged schema change](#staged). ### [](#unstaged-schema-changes)Unstaged schema changes To immediately commit schema changes without staging, use the `--active` option. See [Immediately apply an unstaged schema change](#unstaged). ### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../../../../reference/http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../../../learn/schema/#validation). 
To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | -y, --no-input | Push the change without confirmation input. | | --active | Skip staging the schema and make the schema active immediately. If the schema includes index definition changes, the related indexes may become temporarily unavailable due to index builds. | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --dir | A local directory of .fsl files to push. Defaults to the directory specified in .fauna-project. | | --endpoint | Fauna server endpoint. | | --environment | Environment to use from .fauna-project. | | --help | Help for schema push command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout in milliseconds. | | --url | Database URL. Overrides the url in .fauna-shell. | ## [](#examples)Examples ### [](#staged)Run a staged schema change A staged schema change lets you change one or more [collection schema](../../../../../../reference/fsl/collection/) without index downtime due to [index builds](../../../../../../learn/data-model/indexes/#builds). To run a staged schema change, you must use the [Fauna CLI](../../../../) or the Fauna Core HTTP API’s [Schema endpoints](../../../../../../reference/http/reference/core-api/#tag/Schema). You can’t run a staged schema change using [FQL schema methods](../../../../../../learn/schema/manage-schema/#fql) or the [Fauna Dashboard](https://dashboard.fauna.com/). To run a staged schema change using the Fauna CLI: 1. Make the desired changes to `.fsl` files in your schema directory. 2. Use [`fauna schema push`](../../../../v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../../../v4/commands/schema/push/) stages schema changes by default: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../../../v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../../../../../../learn/data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../../../../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](../../../../../../learn/schema/manage-schema/#unstaged). You must first [abandon the staged schema change](#abandon). 3. 
Use [`fauna schema status`](../../../../v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 4. When the status is `ready`, use [`fauna schema commit`](../../../../v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../../../v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` ### [](#unstaged)Immediately apply an unstaged schema change To apply schema changes immediately without staging, use the `--active` option: ```bash fauna schema push --active ``` Schema changes that trigger an [index build](../../../../../../learn/data-model/indexes/#builds) may result in downtime where the index is not queryable. ### [](#push-a-schema-change-without-input)Push a schema change without input Use the `--no-input` option or its `-y` alias to push a schema change without prompting for confirmation or displaying a diff. This is useful for using the command programmatically, such as in a CI/CD workflow. ```bash fauna schema push --no-input ``` Or: ```bash fauna schema push -y ``` ### [](#specify-a-schema-directory)Specify a schema directory Use `--dir` to specify a schema directory containing schema changes: ```bash fauna schema push --dir schema/myschema ``` ## [](#see-also)See also [`fauna schema status`](../status/) [`fauna schema abandon`](../abandon/) [`fauna schema commit`](../commit/) [`fauna schema diff`](../diff/) [`fauna schema pull`](../pull/) # `fauna schema status` Prints the index build status for a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). ## [](#syntax)Syntax fauna schema status \[--\[no-\]color \] \[--dir \]   \[--endpoint \] \[--environment \] \[--help\]   \[--secret \] \[--timeout \] \[--url \] ## [](#description)Description `fauna schema status` prints the [index build](../../../../../../learn/data-model/indexes/#builds) status for a staged schema change. See [Run a staged schema change](#staged). ## [](#arguments)Arguments None ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --[no-]color | Enables or disables color formatting for the output. Color formatting is enabled by default if the terminal supports it (determined using chalk/supports-color). Use --no-color to disable. | | --dir | A local directory of .fsl files. Defaults to the directory specified in .fauna-project. | | --endpoint | Fauna server endpoint. | | --environment | Environment to use from .fauna-project. | | --help | Help for the command. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --timeout | Connection timeout in milliseconds. | | --url | Database URL. Overrides the url in .fauna-shell. 
| ## [](#examples)Examples ### [](#basic-example)Basic example To check the status of staged schema: ```bash fauna schema status ``` The response includes a status for the staged schema and a summary of any staged or local changes: ```bash Connected to endpoint: cloud-us Staged status: ready Staged changes: * Modifying collection `Customer` at collections.fsl:4:1 (use `fauna schema commit` to commit staged changes) Local changes: * Modifying collection `Customer` at collections.fsl:4:1 (use `fauna schema diff` to display local changes) (use `fauna schema push` to stage local changes) ``` ### [](#staged)Run a staged schema change You typically use the `schema status` command as part of a [staged schema change](../../../../../../learn/schema/manage-schema/#staged). A staged schema change lets you change one or more [collection schema](../../../../../../reference/fsl/collection/) without index downtime due to [index builds](../../../../../../learn/data-model/indexes/#builds). To run a staged schema change, you must use the [Fauna CLI](../../../../) or the Fauna Core HTTP API’s [Schema endpoints](../../../../../../reference/http/reference/core-api/#tag/Schema). You can’t run a staged schema change using [FQL schema methods](../../../../../../learn/schema/manage-schema/#fql) or the [Fauna Dashboard](https://dashboard.fauna.com/). To run a staged schema change using the Fauna CLI: 1. Make the desired changes to `.fsl` files in your schema directory. 2. Use [`fauna schema push`](../../../../v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../../../v4/commands/schema/push/) stages schema changes by default: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. fauna schema push \ --database us/my_db \ --dir /path/to/schema/dir ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../../../v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../../../../../../learn/data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../../../../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](../../../../../../learn/schema/manage-schema/#unstaged). You must first [abandon the staged schema change](#abandon). 3. Use [`fauna schema status`](../../../../v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 4. When the status is `ready`, use [`fauna schema commit`](../../../../v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. 
If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../../../v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` ## [](#see-also)See also [`fauna schema push`](../push/) [`fauna schema status`](./) [`fauna schema abandon`](../abandon/) [`fauna schema commit`](../commit/) # `fauna shell` Start an interactive Fauna session to run queries. ## [](#syntax)Syntax fauna shell \[\] \[--endpoint \] \[--environment \]   \[--file \] \[--format json|json-tagged|shell\] \[--output \]   \[--secret \] \[--stdin\] \[--timeout \] \[--typecheck\]   \[--url \] \[--version 4|10\] ## [](#description)Description The `shell` command starts an interactive query shell for sending Fauna database queries. By default, this command supports FQL v10 queries. For FQL v4 queries, use the `--version 4` option. You can’t use this command to run queries against a parent or peer database. ## [](#arguments)Arguments | Argument | Description | | --- | --- | --- | --- | | | Database path. | ## [](#options)Options | Option | Description | | --- | --- | --- | --- | | --endpoint | Connection endpoint from .fauna-shell. | | --environment | Environment to use, from a Fauna project. | | --file | Name of file with queries to run. | | --format | Output format:     json     json-tagged     shell | | --help | Help for run-queries command. | | --output | File to write output to. | | --secret | Authentication secret. Overrides the secret in .fauna-shell.Use a scoped key to interact with a child database using a parent database’s admin key.For example, with a parent database’s admin key secret of fn123, you can access a child database by appending the child database name and role: fn123:childDB:admin. | | --stdin | Read file input from stdin. Write to stdout by default. | | --timeout | Connection timeout (milliseconds). | | --typecheck | Enable typechecking. | | --url | Database URL. Overrides the URL in .fauna-shell. | | --version | FQL Version:     4 = FQL version 4     10 = (default) FQL version 10See FQL v4 access. | ## [](#example)Example Start the shell: ```bash fauna shell ``` At the prompt, run a query to list all collections using the FQL `Collection.all()` method: ```fql Collection.all() ``` ``` { data: [ { name: "Customer", coll: Collection, ts: Time("2099-07-30T22:22:32.945Z"), constraints: [ { unique: [ { field: ".email", mva: false } ], status: "active" } ], fields: { name: { signature: "String" }, email: { signature: "String" }, address: { signature: "{ street: String, city: String, state: String, postalCode: String, country: String }" } }, history_days: 0, indexes: { byEmail: { terms: [ { field: ".email", mva: false } ], queryable: true, status: "complete" } }, computed_fields: { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" }, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } } }, ... ] } ``` # Tools This section describes the Fauna tools for development, debugging, monitoring, and control. ## [](#in-this-section)In this section [Develop locally using Docker](docker/) A Docker image that runs a single Fauna node in your environment, for development and testing. [Visual Studio Code extension](vs-code/) The Fauna Visual Studio Code extension provides rich support for the Fauna Query Language (FQL) and allows you to run queries, featuring autocompletion and awareness of your database environment. 
# Develop locally using Docker

Fauna Dev is a Docker image that runs a single Fauna node in your environment, which you can use for development and testing. Fauna Dev isn’t licensed or supported for a multinode cluster.

## [](#reqs)Requirements

* [Docker](https://www.docker.com/)
* CPU:
    * Dual-core
    * x86, AMD64, or ARM64
    * 2GHz clock speed
* RAM: 8 GB
* Storage: Local, block-based storage device:
    * SSD
    * hard disk
    * EBS
    * iSCSI

    Network file systems, such as CIFS and NFS, aren’t supported.

## [](#install)Installation

1. Pull the latest Fauna Docker image:

```bash
docker pull fauna/faunadb:latest
```

2. Verify the installation:

```bash
docker run fauna/faunadb --help
```

## [](#ports)Port assignment

When you run Fauna Dev, you must expose the ports for the services running inside the Docker container so they’re accessible. The most commonly used Fauna Dev database service port, which is used in the following examples, is port 8443. If you change the Docker command to use a different port, ensure that your client programs use the same port.

The `docker` command's `-p` option lets you map a host computer port to a container port using the `hostPort:containerPort` syntax. `hostPort` and `containerPort` can be a single port number or a range expressed as `low-high`. For example, to connect your host computer port 1234 to container port 6789, use `-p 1234:6789`. See the [Docker docs](https://docs.docker.com/engine/reference/run/#expose-incoming-ports) for more information.

## [](#run)Run Fauna Dev

The following describes several approaches to running Fauna Dev with Docker. Configuration documentation isn’t provided because on-premise production use isn’t licensed.

### [](#single-developer-node-with-ephemeral-data)Single developer node with ephemeral data

This command starts a Fauna Dev node and initializes a single-node cluster, which is useful for tests that need the database to start in a known state.

```bash
docker run --rm --name faunadb -p 8443:8443 -p 8084:8084 fauna/faunadb
```

With this command, all data is lost when the Docker container is stopped or killed.

### [](#single-developer-node-with-persisted-data)Single developer node with persisted data

This command starts Fauna Dev with a folder or volume bound to the Docker container's data folder:

```bash
docker run --rm --name faunadb -p 8443:8443 -p 8084:8084 \
  -v <host-data-directory-or-volume>:/var/lib/faunadb \
  fauna/faunadb
```

When the Docker container is stopped or killed, your data is persisted in the given folder or volume.

### [](#single-developer-node-with-persisted-data-and-logs)Single developer node with persisted data and logs

This command starts Fauna Dev, binding a local folder or volume to the Docker container's data folder and another local folder or volume to the Docker container's log folder:

```bash
docker run --rm --name faunadb -p 8443:8443 -p 8084:8084 \
  -v <host-data-directory-or-volume>:/var/lib/faunadb \
  -v <host-log-directory-or-volume>:/var/log/faunadb \
  fauna/faunadb
```

When the Docker container is stopped or killed, your data and logs are maintained in the given folders or volumes.

### [](#managed-configuration)Managed configuration

This command starts Fauna Dev with path binds for the data, log, and configuration file, and indicates that the non-default configuration should be used.
```bash
docker run --rm --name faunadb -p 8443:8443 -p 8084:8084 \
  -v <host-data-directory-or-volume>:/var/lib/faunadb \
  -v <host-log-directory-or-volume>:/var/log/faunadb \
  -v <host-config-file>:/etc/faunadb.yml \
  fauna/faunadb --config /etc/faunadb.yml
```

Example `faunadb.yml` configuration:

```yaml
---
auth_root_key: secret
cluster_name: fauna
storage_data_path: /var/lib/faunadb
log_path: /var/log/faunadb
shutdown_grace_period_seconds: 0
network_listen_address: 172.17.0.2
network_broadcast_address: 172.17.0.2
network_admin_http_address: 172.17.0.2
network_coordinator_http_address: 172.17.0.2
```

## [](#connect)Connect

To connect to a Fauna Dev instance using the [Fauna CLI](../../cli/v4/) or a Fauna [driver](../../drivers/), set the following options:

| Option | Recommended value | Description |
| --- | --- | --- |
| secret | Saved secret. | The secret required to connect to a Fauna Dev instance. Fauna Dev instances include a top-level key with the `admin` role. By default, the key’s secret is `secret`. If you run Fauna Dev with your own `faunadb.yml` configuration, you can use `auth_root_key` to specify the secret to use. You can use the top-level key as a scoped key to access all databases for the instance. |
| domain | localhost | fauna-shell, JavaScript, and Python only. |
| port | 8443 | fauna-shell, JavaScript, and Python only. |
| scheme | http | fauna-shell, JavaScript, and Python only. |
| endpoint | http://localhost:8443 | Go only. |

The recommended values are consistent with the preceding [Run Fauna Dev](#run) description. If you change the configuration, use the settings in your configuration file to establish the connection. For more information on configuring a connection to a Fauna instance, see the JavaScript driver [Client Configuration](https://github.com/fauna/fauna-js#client-configuration) README file.

# Visual Studio Code extension

The Fauna Visual Studio Code extension provides rich support for the Fauna Query Language (FQL) and allows you to run queries, featuring autocompletion and awareness of your database environment.

## [](#prerequisites)Prerequisites

* A Fauna account.
* Install [Visual Studio Code](https://code.visualstudio.com/), version 1.4.0 or higher.

## [](#install)Install the Fauna extension

Go to the [Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=fauna.fauna-vscode) and follow the instructions to install the Fauna extension.

## [](#key)Get an authentication token

You need an `admin`-level key’s secret to access the database in VS Code. See [Keys](../../../learn/security/keys/). If you already have a secret for an existing database, you can skip this step.

1. [Log in](https://dashboard.fauna.com/accounts/login) to your Fauna account using your email and password.
2. Choose the **Explorer** menu item.
3. Expand the region group and click the database you want to use.
4. Hover over the database entry you want to use and click the key icon.
5. In the **Keys** tab, click **Create Key**.
6. Choose **Admin** and, optionally, enter **VS Code** as a **Key Name**.
7. Click **Save**.
8. Copy and save the **Key Secret**, which is needed to complete VS Code setup in the [next step](#setup_key).

## [](#setup_key)Add the authentication token to VS Code

Before using the extension, open the extension settings and set the secret for the database you want to run queries against. The secret can be set globally across all VS Code instances and at the Workspace level to allow for different databases per VS Code project.
Configure the Fauna extension for VS Code to access the Fauna database using the authentication token from the [preceding step](#key). 1. In the VS Code activity bar, select the Fauna icon. 2. Click **Configure Fauna Extension**. 3. In the **Fauna: Db Secret** box, paste the saved secret from the [preceding](#key) step. Leave the default **Fauna: Endpoint** unchanged. ## [](#run-fql-queries)Run FQL queries 1. Open the FQL Playground: Toggle the Playground opened or closed using one of the following methods: * `cmd`+`l` on Mac * `ctrl`+`l` on Linux and Windows * **Fauna: Toggle Playground** from the command palette. 2. Enter an FQL query in the edit box. Submit the query by clicking **Fauna: Run Query** at the top right or entering `Cmd`+`Enter` or `Ctrl`+`Enter`. The response is displayed in the output window. You can use .fql files to save any number of queries in your project. ## [](#commands)Commands The following commands are available from the command palette. | Command | Description | Key binding | | --- | --- | --- | --- | --- | | Fauna: Run Query | Run the query in the FQL Playground or active .fql file. | Cmd+EnterCtrl+Enter | | Fauna: Run Query as Document | Run the query in the FQL Playground or active .fql file as the provided document. This is useful if you have a document that has role membership and want to test permission. | | | Fauna: Run Query with Secret | Run the query in the FQL Playground or active .fql files with a given secret instead of using the secret set for the extension. | | | Fauna: Run Query as Role | Run the query in the FQL Playground or active .fql file as the provided role. | | | Fauna: Toggle Playground | Open the FQL Playground if closed or close the FQL Playground if open. This command saves the contents before closing. | cmd+lctrl+l | # Tutorials This section contains step-by-step tutorials that walk you through common Fauna patterns and use cases. ## [](#in-this-section)In this section [Perform basic operations](todo/) Learn Fauna’s data model and common FQL query patterns using code from a sample to-do application. [Set up a project using FSL and the Fauna CLI](project/) Set up an application project using the Fauna CLI and FSL schema files. [Progressively enforce a document type](schema/) Use collection schema and zero-downtime migrations to progressively define and enforce a document structure. [Build an end-user authentication system](auth/) Implement end-user authentication in Fauna using [credentials](../../learn/security/tokens/#credentials) and [user-defined functions (UDFs)](../../learn/schema/user-defined-functions/). [Control access with ABAC](abac/) Control access to data using attribute-based access control (ABAC) and user-defined functions (UDFs). # Perform basic operations Developers often build a to-do app to learn a new database because it is a straightforward project that shows the basic capabilities of the database. In this guide you build a to-do app and learn the following Fauna core concepts. * Create, Read, Update, Delete (CRUD) in Fauna * Query patterns * Database relationships ## [](#create-collections)Create collections 1. Create a new database in the [Fauna Dashboard](https://dashboard.fauna.com/). 2. Use the Dashboard to add the following `User` collection to the database: ```fsl collection User { unique [.email] index byEmail { terms [.email] } } ``` 3. 
Add the following `Todo` collection: ```fsl collection Todo { index byOwner { terms [.owner] values [.title, .status, .owner] } } ``` ## [](#create-documents)Create documents Run the following FQL code to create a new user document in the `User` collection. ```fql User.create({ name: "William T", email: "ryker@example.com" }) ``` ``` { id: "388327586567028802", coll: User, ts: Time("2099-01-30T06:41:35.660Z"), name: "William T", email: "ryker@example.com" } ``` Next, create two todo items for the user. Run the following FQL code to create these items. ```fql Todo.create({ title: "Get Milk", status: "Pending" }) Todo.create({ title: "Do Laundry", status: "Pending" }) ``` ## [](#read-documents)Read documents You can get all the documents in a collection by calling the `.all()` method. The following code shows how you get all the todos. Run the following code snippet in the Dashboard Shell. ```fql Todo.all() ``` ``` { data: [ { id: "388467097679691849", coll: Todo, ts: Time("2099-01-31T19:39:03.850Z"), title: "Get Milk", status: "Pending" }, { id: "388467097679692873", coll: Todo, ts: Time("2099-01-31T19:39:03.850Z"), title: "Do Laundry", status: "Pending" } ] } ``` To get a document by its ID, you can use the [`collection.byId()`](../../../reference/fql-api/collection/instance-byid/) method. The following code shows how to query a document by ID. In the following FQL code you query the todo item with the ID `388467097679691849`. ```fql Todo.byId('388467097679691849') ``` ``` { id: "388467097679691849", coll: Todo, ts: Time("2099-01-31T19:39:03.850Z"), title: "Get Milk", status: "Pending" } ``` You can also query documents using the `User` collection’s `byEmail` index. You can find the user document where the email field equals `ryker@example.com` with the following FQL code. ```fql User.byEmail("ryker@example.com") ``` ``` { data: [ { id: "388327586567028802", coll: User, ts: Time("2099-01-30T06:41:35.660Z"), name: "William T", email: "ryker@example.com" } ] } ``` The index call returns a [Set](../../../reference/fql-api/set/) of documents. You can use the [`set.first()`](../../../reference/fql-api/set/first/) method if you want to get a single document. Following is an example. In this scenario you get the first result. ```fql User.byEmail("ryker@example.com").first() ``` ``` { id: "388327586567028802", coll: User, ts: Time("2099-01-30T06:41:35.660Z"), name: "William T", email: "ryker@example.com" } ``` ## [](#update-documents)Update documents You use the [`document.update()`](../../../reference/fql-api/document/update/) method to update documents. Following is an example. In the following FQL code, you update the todo item with the ID `388467097679691849` and update the status field to `Done.` ```fql let todo = Todo.byId("388467097679691849") todo?.update({ status: "Done" }) ``` ``` { id: "388467097679691849", coll: Todo, ts: Time("2099-04-03T03:43:10.010Z"), title: "Get Milk", status: "Done" } ``` ## [](#delete-documents)Delete documents To delete a document you run the [`collectionDef.delete()`](../../../reference/fql-api/collection/delete/) method. Following is an example to delete a todo item. ```fql let todo = Todo.byId("388467097679691849") todo?.delete() ``` ``` Todo("388467097679691849") /* deleted */ ``` ## [](#document-relationships)Document relationships The User and Todo collections have a one-to-many relationship. A user can have many todos. To define this relationship, you can create a new field in the Todo collection and have a reference to the user to whom it belongs. 
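The tutorial keeps the `Todo` collection schemaless and simply writes an `owner` field, as shown in the next query. If you later want Fauna to enforce the relationship, a minimal FSL sketch is shown below. It assumes `Ref<User>` field typing and keeps the wildcard constraint so the existing ad hoc `title` and `status` fields are still accepted; depending on your existing data, you may also need a `migrations` block (see the schema tutorial later in this section):

```fsl
collection Todo {
  // Optional, typed reference to the owning `User` document.
  owner: Ref<User>?

  // Keep accepting ad hoc fields such as `title` and `status`.
  *: Any

  index byOwner {
    terms [.owner]
    values [.title, .status, .owner]
  }
}
```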
The following code establishes a one-to-many relationship between a user and todos. ```fql let user = User.byId("388327586567028802") Todo.create({ title: "Car Wash", status: "Pending", owner: user }) Todo.create({ title: "Pick up the phase modulator", status: "Pending", owner: user }) ``` ``` { id: "388472684753716297", coll: Todo, ts: Time("2099-01-31T21:07:52.100Z"), title: "Pick up the phase modulator", status: "Pending", owner: User("388327586567028802") } ``` To get all the todos for a particular user, you can run the following query. ```fql let user = User.byId("388327586567028802") Todo.byOwner(user) ``` ``` { data: [ { id: "388472684753715273", coll: Todo, ts: Time("2099-01-31T21:07:52.100Z"), title: "Car Wash", status: "Pending", owner: User("388327586567028802") }, { id: "388472684753716297", coll: Todo, ts: Time("2099-01-31T21:07:52.100Z"), title: "Pick up the phase modulator", status: "Pending", owner: User("388327586567028802") } ] } ``` You can also apply further filters. You can run the following query to get all the todos in pending status with the word `Car` in the title. ```fql let user = User.byId("388327586567028802") Todo.byOwner(user) .where(.status == "Pending" && .title.includes("Car")) ``` ``` { data: [ { id: "388472684753715273", coll: Todo, ts: Time("2099-01-31T21:07:52.100Z"), title: "Car Wash", status: "Pending", owner: User("388327586567028802") } ] } ``` You can also retrieve the user information for a to-do item when you query the to-do item. Following is an example. ```fql Todo.byId("388472684753715273") { owner } ``` ``` { owner: { id: "388327586567028802", coll: User, ts: Time("2099-01-30T06:41:35.660Z"), name: "William T", email: "ryker@example.com" } } ``` You can also retrieve selected fields and related entities of the document. If you want the title and status from a todo and the name of the owner, you can do so. Following is an example of such a query. ```fql Todo.byId("388472684753715273") { title, status, owner { name } } ``` ``` { title: "Car Wash", status: "Pending", owner: { name: "William T" } } ``` # Set up a project using FSL and the Fauna CLI | Learn: Schema, Fauna CLI v4 | | --- | --- | --- | This high-level guide shows how to set up a project for an application using the [Fauna CLI](../../cli/v4/) and [FSL files](../../../learn/schema/). While not required, we recommend using this workflow for production apps. The setup lets you manage your database schema as declarative files alongside your app’s code. ## [](#set-up-a-project)Set up a project 1. If you haven’t already, install the Fauna CLI: ```bash npm install -g fauna-shell ``` 2. If you haven’t already, log in to Fauna using the CLI: ```cli fauna login ``` 3. If you haven’t already, create a directory for the project and navigate to it. In most cases, the directory also contains your app’s source code. For example: ```bash mkdir cd ``` 4. Create one or more databases for the app: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. fauna database create \ --name my_db \ --database us ``` 5. Create and navigate to a schema directory. The directory can use any name. ```cli mkdir schema cd schema ``` 6. In the project’s schema directory, create and save one or more `.fsl` files. 
For example, you can create a `collections.fsl` file with the following FSL collection schema: ```fsl collection Customer { name: String email: String index byEmail { terms [.email] } unique [.email] } ``` An `.fsl` file can contain schema for multiple resources. You can use multiple `.fsl` files to organize your schema. There is no performance benefit to splitting `.fsl` files or storing larger, individual files. 7. Run a [staged schema change](../../../learn/schema/manage-schema/#staged) to commit the schema to Fauna: 1. Use [`fauna schema push`](../../cli/v4/commands/schema/push/) to stage the schema changes. [`fauna schema push`](../../cli/v4/commands/schema/push/) stages schema changes by default: ```cli fauna schema push \ --database us/my_db ``` A database can have one staged schema change at a time. You can update staged schema using [`fauna schema push`](../../cli/v4/commands/schema/push/). When a database has staged schema, any access or updates done using FQL’s schema commands on related [system collections](../../../learn/data-model/collections/#system-coll) interact with the staged schema, not the database’s active schema. For example, when schema changes are staged, [`Collection.all()`](../../../reference/fql-api/collection/static-all/) returns `Collection` documents for the staged collection schema, not the database’s `Collection` documents. If a database has staged schema, you can’t edit the database’s active schema using FQL, the [Dashboard](https://dashboard.fauna.com/), or an [unstaged schema change](../../../learn/schema/manage-schema/#unstaged). You must first [abandon the staged schema change](#abandon). 2. Use [`fauna schema status`](../../cli/v4/commands/schema/status/) to check the status of the staged schema: ```cli fauna schema status \ --database us/my_db ``` Possible statuses: | Staged status | Description | | --- | --- | --- | --- | | pending | Changes are being processed. New indexes are still being built. | | ready | All indexes have been built. Changes are ready to commit. | | failed | There was an error during the staging process. | 3. When the status is `ready`, use [`fauna schema commit`](../../cli/v4/commands/schema/commit/) to apply the staged schema to the database: ```cli fauna schema commit \ --database us/my_db ``` You can only commit staged schema with a status of `ready`. If you no longer wish to apply the staged schema or if the status is `failed`, use [`fauna schema abandon`](../../cli/v4/commands/schema/abandon/) to unstage the schema: ```cli fauna schema abandon \ --database us/my_db ``` Before pushing changes, the command displays a diff. If wanted, you can then accept or reject the changes. 8. Use [`fauna query`](../../cli/v4/commands/query/) to create a [key](../../../learn/security/keys/) for one or more of your databases: ```cli fauna query "Key.create({ role: 'admin' })" \ --database us/my_db ``` Your app can use the key’s secret to authenticate Fauna requests using a [client driver](../../drivers/) or the [Fauna Core HTTP API](../../../reference/http/reference/core-api/). You can also use the key to bootstrap a Fauna-based [end-user authentication system](../auth/). ## [](#next-steps)Next steps Congratulations! You’ve initialized a project and you’re ready to start building your app with Fauna. To learn how to run queries from your app, check out the [client driver docs](../../drivers/) or the [Query HTTP API endpoint reference](../../../reference/http/reference/core-api/#operation/query). 
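As a quick smoke test before wiring up a driver, you can call the Query HTTP API endpoint directly with the key you just created. The following sketch assumes the default `https://db.fauna.com` endpoint (substitute your region group's endpoint if it differs) and that the key's secret is exported as the `FAUNA_SECRET` environment variable:

```bash
# Runs an FQL query using the Core HTTP API's Query endpoint.
curl -s https://db.fauna.com/query/1 \
  -H "Authorization: Bearer $FAUNA_SECRET" \
  -H "Content-Type: application/json" \
  -d '{"query": "Customer.all()"}'
```

The response includes a `data` field with the query results, along with metadata such as query stats.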
If you’d like to see an example, check out the sample apps on GitHub: ![JavaScript](../../_images/drivers/logos/javascript.svg) [JavaScript](https://github.com/fauna/js-sample-app) ![Python](../../_images/drivers/logos/python.svg) [Python](https://github.com/fauna/python-sample-app) ![C#](../../_images/drivers/logos/csharp.svg) [.NET/C#](https://github.com/fauna/dotnet-sample-app) ![Java](../../_images/drivers/logos/java.svg) [Java](https://github.com/fauna/java-sample-app) # Progressively enforce a document type | Learn: Schema | | --- | --- | --- | Fauna supports [progressive enforcement](../../../learn/schema/#progressive-enforcement) for [document types](../../../learn/schema/#document-type-definitions). Early in an application’s development, you can use collections with a schemaless or permissive document type to allow ad hoc fields in documents. As your application evolves, you can use zero-downtime migrations to add stricter field definitions and to normalize field values. This lets you iteratively move from a permissive document type to a strict document type (or the reverse) based on your application’s needs. In this tutorial, you’ll progressively define and enforce a document type for an example e-commerce application. ## [](#before-you-start)Before you start To complete this tutorial, you’ll need: * The [Fauna CLI](../../cli/v4/) * Familiarity with [Fauna schema](../../../learn/schema/) ## [](#setup)Setup Set up a database for the application. 1. If you haven’t already, log in to Fauna using the CLI: ```cli fauna login ``` 2. Create an `ecommerce` directory and navigate to it: ```bash mkdir ecommerce cd ecommerce ``` 3. Create an `ecommerce` database: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. fauna database create \ --name "ecommerce" \ --database us ``` ## [](#create-a-schemaless-collection)Create a schemaless collection Create a schemaless collection. A schemaless collection has no predefined document fields. Instead, you can include ad hoc fields of any type in the collection’s documents. 1. Create and navigate to the `schema` directory: ```bash mkdir schema cd schema ``` 2. Create a `collections.fsl` file and add the following collection schema to it: ```fsl // Stores product data for the e-commerce application. collection Product { // Contains no field definitions. // The wildcard constraint accepts // ad hoc fields of any type. *: Any // If a collection schema has no field definitions // and no wildcard constraint, it has an implicit // wildcard constraint of `*: Any`. // Defines the `sortedByStock()` index. // Use the index to get `Product` documents sorted by // ascending `stock` field value. index sortedByStock { values [.stock] } } ``` 3. Save `collections.fsl`. Then push the schema to Fauna: ```cli # Replace 'us' with your database's region group. fauna schema push \ --database us/ecommerce ``` When prompted, accept and stage the schema. 4. Check the status of the staged schema: ```cli fauna schema status \ --database us/ecommerce ``` 5. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/ecommerce ``` The commit applies the staged schema to the database. This creates the collection. ## [](#add-sample-data)Add sample data Add documents to the `Product` collection. 1. Start a shell session with the `admin` role in the Fauna CLI: ```cli fauna shell \ --database us/ecommerce \ --role admin ``` 2. 
Enter editor mode to run multi-line queries: ```bash > .editor ``` 3. Run the following FQL query: ```fql // Creates a `Product` collection document. // In a schemaless collection, // documents can contain any field of any type. Product.create({ name: "pinata", description: "Original Classic Donkey Pinata", price: 24_99, stock: 40 }) // In a schemaless collection, a field // can contain different types across documents. Product.create({ name: "cups", description: "Translucent 9 Oz, 100 ct", price: 6_98, // This document's `stock` field uses a // different type than the previous document. stock: "foo", // The previous document didn't include // the `backorder` field. backorder: { limit: 5, backordered: false } }) ``` The query creates all documents but only returns the last document. 4. Press Ctrl+D to exit the shell. ## [](#migrate-to-a-permissive-document-type)Migrate to a permissive document type As application development continues, stricter data requirements typically evolve. In this case, the e-commerce application wants to use the `name` field to fetch specific `Product` documents. Edit the `Product` collection schema to add: * A field definition for the `name` field * A migrations block * A `byName()` index definition 1. Edit `collections.fsl` as follows: ```fsl collection Product { // Field definition for the `name` field. // `name` only accepts `String` values. name: String // Adds the `typeConflicts` field as a catch-all field for // existing `name` values that aren't `String` or `null`. // Because `typeConflicts` is used in a `move_conflicts` statement, // it must have a type of `{ *: Any }?`. typeConflicts: { *: Any }? // The schema now includes field definitions. // The wildcard constraint is required to // accept documents with ad hoc fields. // You can only remove a wildcard constraint // with a migration. *: Any // Instructs Fauna on how to handle the field and // wildcard constraint updates. migrations { // Migration #1 (Current) // Adds the `typeConflicts` field. add .typeConflicts // Adds the `name` field. add .name // Nests non-conforming `name` and `typeConflicts` field // values in the `typeConflicts` field. move_conflicts .typeConflicts // Sets `name` to `"Default"` for existing documents // with a non-string `name` value. backfill .name = "Default" } // Defines the `byName()` index. // Use the index to get `Product` documents by `name` value. index byName { terms [.name] } index sortedByStock { values [.stock] } } ``` 2. Save `collections.fsl`. Then push the schema to Fauna: ```cli fauna schema push \ --database us/ecommerce ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/ecommerce ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/ecommerce ``` This runs the migration. New `Product` collection documents must now contain a `name` field with a [String](../../../reference/fql/types/#string) value. The documents can also contain other ad hoc fields. ### [](#review-existing-data)Review existing data Run a few queries to review changes the migration made to the collection’s documents. 1. Start a shell session: ```cli fauna shell \ --database us/ecommerce --role admin ``` 2. 
Run the following query: ```fql Product.sortedByStock() ``` The query returns all `Product` collection documents, sorted by ascending `stock` value: ``` { data: [ { id: "400414078704549921", coll: Product, ts: Time("2099-06-11T16:31:12.790Z"), name: "pinata", description: "Original Classic Donkey Pinata", price: 2499, stock: 40 }, { id: "400414078705598497", coll: Product, ts: Time("2099-06-11T16:31:12.790Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: "foo", backorder: { limit: 5, backordered: false } } ] } ``` The migration made no changes to existing documents. The documents already contain a `name` field with a [String](../../../reference/fql/types/#string) value. ### [](#add-new-data)Add new data New `Product` collection documents must contain a `name` field with a [String](../../../reference/fql/types/#string) value. Attempt to add some non-conforming documents to the collection. 1. In the shell session, run the following query in editor mode: ```fql // Create a `Product` collection document // without a `name` field. Product.create({ description: "Conventional Hass, 4ct bag", price: 3_99, stock: 95 }) ``` The query returns an error. `Product` collection documents must contain a `name` field. 2. Run the following query in editor mode: ```fql // Create a `Product` collection document // with a non-String `name` value Product.create({ name: 12345, description: "Conventional Hass, 4ct bag", price: 3_99, stock: 95 }) ``` The query returns an error. The `name` field must contain a [String](../../../reference/fql/types/#string). 3. Run the following query in editor mode: ```fql // Create a `Product` collection document // with a `name` String Product.create({ name: "avocados", description: "Conventional Hass, 4ct bag", price: 3_99, stock: 95 }) ``` The query runs successfully and creates a new `Product` document. 4. Press Ctrl+D to exit the shell. ## [](#add-more-field-definitions)Add more field definitions You can perform iterative migrations to add more field definitions. Edit the `Product` collection schema to: * Add field definitions for the `description` and `price` fields * Update the migrations block 1. Edit `collections.fsl` as follows: ```fsl collection Product { name: String // Adds the `description` field. // Only accepts `String` values. description: String // Adds the `price` field. // Only accepts `Int` values. price: Int typeConflicts: { *: Any }? *: Any migrations { // Migration #1 (Previous) // Already run. Fauna ignores // previously run migration statements. add .typeConflicts add .name move_conflicts .typeConflicts backfill .name = "Default" // Migration #2 (Current) // New migration statements. Fauna // only runs these statements. add .description add .price // Uses the existing `typeConflicts` field // as a catch-all field. move_conflicts .typeConflicts backfill .description = "Default" backfill .price = 1 } index byName { terms [.name] } index sortedByStock { values [.stock] } } ``` 2. Save `collections.fsl`. Then push the schema to Fauna: ```cli fauna schema push \ --database us/ecommerce ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/ecommerce ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/ecommerce ``` This runs the migration. 
New `Product` collection documents must now contain: * A `name` field with a [String](../../../reference/fql/types/#string) value * A `description` field with a [String](../../../reference/fql/types/#string) value * A `price` field with an [Int](../../../reference/fql/types/#int) value The documents can also contain other ad hoc fields. ### [](#review-existing-data-2)Review existing data Run a few queries to review changes the migration made to the collection’s documents. 1. Start a shell session: ```cli fauna shell \ --database us/ecommerce \ --role admin ``` 2. Run the following query: ```fql Product.sortedByStock() ``` The query returns all `Product` collection documents, sorted by ascending `stock` value: ``` { data: [ { id: "400414078704549921", coll: Product, ts: Time("2099-06-11T16:31:12.790Z"), name: "pinata", description: "Original Classic Donkey Pinata", price: 2499, stock: 40 }, ... ] } ``` The migration made no changes to the existing documents. 3. Press Ctrl+D to exit the shell. ## [](#handle-non-conforming-values)Handle non-conforming values Ad hoc fields may contain different types across documents in a collection. You can add a field definition to normalize an existing ad hoc field and narrow its accepted types. A `move_conflicts` migration statement lets you assign non-conforming field values to a catch-all field. Edit the `Product` collection schema to: * Add a field definition for the `stock` field * Update migrations block 1. Edit `collections.fsl` as follows: ```fsl collection Product { name: String description: String price: Int // Adds the `stock` field. // Only accepts `Int` values. stock: Int typeConflicts: { *: Any }? *: Any migrations { // Migration #1 (Previous) // Already run. Ignored. add .typeConflicts add .name move_conflicts .typeConflicts backfill .name = "Default" // Migration #2 (Previous) // Already run. Ignored. add .description add .price move_conflicts .typeConflicts backfill .description = "Default" backfill .price = 1 // Migration #3 (Current) // New migration statements. add .stock // Uses the existing `typeConflicts` field // as a catch-all field. move_conflicts .typeConflicts backfill .stock = 0 } index byName { terms [.name] } index sortedByStock { values [.stock] } } ``` 2. Save `collections.fsl`. Then push the schema to Fauna: ```cli fauna schema push \ --database us/ecommerce ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/ecommerce ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/ecommerce ``` This runs the migration. New `Product` collection documents must now contain: * A `name` field with a [String](../../../reference/fql/types/#string) value * A `description` field with a [String](../../../reference/fql/types/#string) value * A `price` field with an [Int](../../../reference/fql/types/#int) value * A `stock` field with an [Int](../../../reference/fql/types/#int) value The documents can also contain other ad hoc fields. ### [](#review-existing-data-3)Review existing data Run a query to see how the migration affected a document with a non-conforming `stock` value. 1. Start a shell session: ```cli fauna shell \ --database us/ecommerce \ --role admin ``` 2. 
Run the following query: ```fql Product.byName("cups").first() ``` The query returns: ``` { id: "400414078705598497", coll: Product, ts: Time("2099-06-11T17:24:08.690Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 0, backorder: { limit: 5, backordered: false }, typeConflicts: { stock: "foo" } } ``` During the migration, Fauna nested the document’s non-conforming `stock` value in the `typeConflicts` catch-all field, which was specified by the `move_conflicts` statement. The migration then filled in the specified `backfill` value of `0` so the document conforms to the field definition. 3. Press Ctrl+D to exit the shell. ## [](#migrate-to-a-strict-document-type)Migrate to a strict document type As the application scales and its data model stabilizes, storage costs and data consistency become more important than flexibility. It no longer makes sense to accept and store ad hoc fields. Edit the `Product` collection schema to remove the wildcard constraint. 1. Edit `collections.fsl` as follows: ```fsl collection Product { name: String description: String price: Int stock: Int typeConflicts: { *: Any }? // Removes the `*: Any` wildcard constraint. // The collection no longer accepts // documents with ad hoc fields. migrations { // Migration #1 (Previous) // Already run. Ignored. add .typeConflicts add .name move_conflicts .typeConflicts backfill .name = "Default" // Migration #2 (Previous) // Already run. Ignored. add .description add .price move_conflicts .typeConflicts backfill .description = "Default" backfill .price = 1 // Migration #3 (Previous) // Already run. Ignored. add .stock move_conflicts .typeConflicts backfill .stock = 0 // Migration #4 (Current) // New migration statements. // Moves any existing ad hoc fields to // the `typeConflicts` catch-all field. move_wildcard .typeConflicts } index byName { terms [.name] } index sortedByStock { values [.stock] } } ``` 2. Save `collections.fsl`. Then push the schema to Fauna: ```cli fauna schema push \ --database us/ecommerce ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/ecommerce ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/ecommerce ``` This runs the migration. New `Product` collection documents must now contain: * A `name` field with a [String](../../../reference/fql/types/#string) value * A `description` field with a [String](../../../reference/fql/types/#string) value * A `price` field with an [Int](../../../reference/fql/types/#int) value * A `stock` field with an [Int](../../../reference/fql/types/#int) value The documents can’t contain other fields. ### [](#review-existing-data-4)Review existing data Run a query to see how the migration affected a document that contained an ad hoc field without a field definition. 1. Start a shell session: ```cli fauna shell \ --database us/ecommerce \ --role admin ``` 2. Run the following query in editor mode: ```fql // Gets `Product` collection documents // with a `name` of `cups`. Then get the // first document. 
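// `byName()` is the index defined in the `Product` collection schema.
// `name` values aren't unique in this collection, so `first()` simply
// returns one matching document.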
Product.byName("cups").first() ``` The query returns: ``` { id: "400414078705598497", coll: Product, ts: Time("2099-06-11T17:24:08.690Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 0, typeConflicts: { stock: "foo", backorder: { limit: 5, backordered: false } } } ``` During the migration, Fauna nested the document’s ad hoc `backorder` field value in the `typeConflicts` catch-all field, which specified by the `move_wildcard` statement. ### [](#attempt-to-add-invalid-data)Attempt to add invalid data Attempt to add non-conforming documents to the `Product` collection. New documents can’t contain ad hoc fields. Field values must conform to their field definitions. 1. In the shell session, run the following query in editor mode: ```fql // Create a `Product` collection document // with an ad hoc `stocked` field. Product.create({ name: "limes", description: "Conventional, 1 ct", price: 35, stock: 100, stocked: true }) ``` The query returns an error. The `Product` collection no longer accepts documents with ad hoc fields. 2. Run the following query in editor mode: ```fql // Create a `Product` collection document // with a missing `stock` field. Product.create({ name: "limes", description: "Conventional, 1 ct", price: 35 }) ``` The query returns an error. The `stock` field must be present in new documents. 3. Run the following query in editor mode: ```fql // Create a `Product` collection document // with all required fields and no ad hoc fields. Product.create({ name: "limes", description: "Conventional, 1 ct", price: 35, stock: 100 }) ``` The query runs successfully and creates a new `Product` document. 4. Press Ctrl+D to exit the shell. # Build an end-user authentication system | Learn: Credentials | | --- | --- | --- | You can use [credentials](../../../learn/security/tokens/#credentials) and [user-defined functions (UDFs)](../../../learn/schema/user-defined-functions/) to create an end-user authentication system. In this tutorial, you’ll build a user authentication system for an example e-commerce application. The system follows the [principle of least privilege](https://en.wikipedia.org/wiki/Principle_of_least_privilege) by: * Limiting the privileges of roles assigned to a user’s tokens * Only allowing unauthenticated users to access data through UDFs ## [](#before-you-start)Before you start To complete this tutorial, you’ll need: * The [Fauna CLI](../../cli/v4/) * Familiarity with the following Fauna features: * [Authentication](../../../learn/security/authentication/) * [Authorization](../../../learn/security/authorization/) * [Roles](../../../learn/security/roles/) ## [](#setup)Setup Set up a database with demo data. 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/) and create a database. When creating the database, enable demo data. 2. If you haven’t already, log in to Fauna using the CLI: ```cli fauna login ``` 3. Create an `ecommerce` directory and navigate to it: ```bash mkdir ecommerce cd ecommerce ``` 4. Create a `schema` directory: ```bash mkdir schema ``` 5. Pull the database’s active schema to the local schema directory: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'my_db' with your database's name. fauna schema pull \ --database us/my_db \ --dir ./schema ``` 6. Navigate to the `schema` directory: ```bash cd schema ``` 7. 
To simplify the tutorial, open `collections.fsl` and edit the `Customer` collection as follows:

```fsl
collection Customer {
  // Make `name` nullable.
  name: String?
  email: String
  // Make `address` nullable.
  address: {
    street: String
    city: String
    state: String
    postalCode: String
    country: String
  }?

  // Leave remaining schema as is.
  ...
}
```

## [](#create-user-login-functions)Create user login functions

Create UDFs that let end users:

* Sign up as application users by providing an email address and password
* Log in to the application using their email and password
* Log out of the application

1. In the schema directory, add the following function schema to the end of `functions.fsl`:

```fsl
...
// Defines the `UserSignup()` UDF. Unauthenticated users can use
// the UDF to create a `Customer` identity document and credential.
@role("server") // Runs with the `server` role's privileges.
function UserSignup(email, password) {
  // Creates a `Customer` collection document.
  // The `Customer` document acts as an identity document that
  // represents the end user.
  let customer = Customer.create({ email: email })

  // Creates a credential that associates the
  // `Customer` document with an end user's password.
  let credential = Credential.create({
    document: customer,
    password: password
  })

  // Outputs the credential as an object.
  Object.assign({ }, credential)
}

// Defines the `UserLogin()` UDF. Unauthenticated users can use the
// UDF to create a token associated with their identity document.
@role("server") // Runs with the `server` role's privileges.
function UserLogin(email, password) {
  // Uses the `Customer` collection's `byEmail()` index to
  // get `Customer` collection documents by `email` field value.
  // In the `Customer` collection, `email` field values are unique,
  // so return the `first()` (and only) document.
  let customer = Customer.byEmail(email)?.first()

  // Gets the credential for the above `Customer` document and
  // passes it to `login()` to create a token.
  // The `Customer` document is the token's identity document.
  // Sets the token's `ttl` to 60 minutes from the current query
  // time. The token's secret expires at its `ttl`.
  Credential.byDocument(customer)
    ?.login(password, Time.now().add(60, "minutes"))
}

// Defines the `UserLogout()` UDF. Authenticated users can use the
// UDF to delete their authentication token. The user creates a
// new token when they next log in.
function UserLogout() {
  // `Query.token()` gets the token document for the
  // query's authentication token.
  Query.token()!.delete()
}
```

## [](#create-a-user-defined-roles)Create user-defined roles

Unauthenticated users should only be able to sign up or log in to the application. Authenticated users should have read access to the application’s products and should be able to log out of the application.

Update the database as follows:

* Create a `customer` role with read access to `Product` documents and the ability to call the `UserLogout()` function.
* Create an `unauthenticated` role with the ability to call the `UserSignup()` and `UserLogin()` functions.

1. In the schema directory, create the `roles.fsl` file and add the following schema to it:

```fsl
// Defines the `customer` role.
role customer {
  // Assigns the `customer` role to tokens with
  // an identity document in the `Customer` collection.
  membership Customer

  // Grants `read` access to `Product` collection documents.
  privileges Product {
    read
  }

  // Grants the ability to call the `UserLogout()` UDF.
  privileges UserLogout {
    call
  }
}

// Defines the `unauthenticated` role.
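// Unlike `customer`, this role has no membership definition.
// It's granted to a key explicitly later in this tutorial using
// `Key.create({ role: "unauthenticated" })`.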
role unauthenticated {
  // Grants the ability to call the `UserSignup()` UDF.
  privileges UserSignup {
    call
  }

  // Grants the ability to call the `UserLogin()` UDF.
  privileges UserLogin {
    call
  }
}
```

2. Save `collections.fsl`, `functions.fsl`, and `roles.fsl`. Then push the schema to Fauna:

```cli
fauna schema push \
  --database us/my_db
```

When prompted, accept and stage the schema.

3. Check the status of the staged schema:

```cli
fauna schema status \
  --database us/my_db
```

4. When the status is `ready`, commit the staged schema to the database:

```cli
fauna schema commit \
  --database us/my_db
```

The commit applies the staged schema to the database. This creates the new UDFs and roles.

## [](#create-a-key-for-unauthenticated-users)Create a key for unauthenticated users

Create a key that’s assigned the `unauthenticated` role. The key acts as a bootstrap to let unauthenticated users sign up or log in to the application. You’d typically store the key’s secret in the `FAUNA_SECRET` environment variable for use in a Fauna client driver.

1. Start a shell session in the Fauna CLI:

```cli
fauna shell \
  --database us/my_db
```

2. Run the following FQL query:

```fql
Key.create({ role: "unauthenticated" })
```

Copy the returned key’s `secret`. You’ll use the secret later in the tutorial.

3. Press Ctrl+D to exit the shell.

## [](#test-unauthenticated-user-access)Test unauthenticated user access

Use the key’s secret to run queries as an unauthenticated user. The key only has privileges for the login-related UDFs you created earlier. The key shouldn’t have access to product data.

1. Start a new shell session using the key’s secret:

```cli
fauna shell \
  --secret <key-secret>
```

2. Enter editor mode to run multi-line queries:

```bash
> .editor
```

3. Run the following query:

```fql
// Attempt to read `Product` collection documents.
Product.all()
```

The query returns an empty Set:

```
{
  data: []
}
```

The `unauthenticated` role doesn’t grant access to `Product` collection documents.

## [](#sign-up-and-log-in-as-an-end-user)Sign up and log in as an end user

Use the key to create a token tied to a user’s `Customer` identity document. Once created, the application can store the token’s secret and use it to authenticate Fauna queries on behalf of the user.

1. In the shell session, run the following query in editor mode:

```fql
// Calls the `UserSignup()` UDF. Passes the `email` for the
// `Customer` identity document you created earlier as the first
// argument. Passes the user's password as the second argument.
UserSignup("john.doe@example.com", "sekret")
```

The result contains a permission-denied message. This is expected: the `unauthenticated` role doesn’t have read access to `Customer` documents. The function call still creates the related credential.

2. Run the following query in editor mode:

```fql
// Calls the `UserLogin()` UDF. Passes the same arguments as the
// earlier `UserSignup()` UDF call.
UserLogin("john.doe@example.com", "sekret")
```

Copy the returned token’s `secret`. You’ll use the secret later in the tutorial.

3. Press Ctrl+D to exit the shell.

## [](#test-authenticated-user-access)Test authenticated user access

The user’s token can access product data. Use the token’s secret to access the data and then log out.

1. Start a new shell session using the token secret:

```cli
fauna shell \
  --secret <token-secret>
```

2.
Run the following query: ```fql Product.all() ``` The query runs successfully: ``` { data: [ { id: "111", coll: Product, ts: Time("2099-07-31T13:29:18.350Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, ... ] } ``` 3. Run the following query in editor mode: ```fql // Calls the `UserLogout()` UDF. The call deletes the authentication // token and its secret. Users can create a new token and secret by // logging in again using `UserLogin()`. UserLogout() ``` The query runs successfully and indicates the token was deleted. # Control access with ABAC In Fauna, you implement [attribute-based access control (ABAC)](../../../learn/security/abac/) using role-related predicates. These predicates let you assign roles and grant privileges based on attributes. You can further limit access using [user-defined functions (UDFs)](../../../learn/schema/user-defined-functions/). UDFs give you granular control over the way data is queried and returned. In this tutorial, you’ll use ABAC and UDFs to dynamically control access to medical appointment data. ## [](#before-you-start)Before you start To complete this tutorial, you’ll need: * The [Fauna CLI](../../cli/v4/) * Familiarity with the following Fauna features: * [Authentication](../../../learn/security/authentication/) * [Authorization](../../../learn/security/authorization/) * [Roles](../../../learn/security/roles/) ## [](#setup)Setup Set up a database with sample data and a user-defined role. 1. If you haven’t already, log in to Fauna using the CLI: ```cli fauna login ``` 2. Create an `abac` directory and navigate to it: ```bash mkdir abac cd abac ``` 3. Create an `abac` database: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. fauna database create \ --name abac \ --database us ``` 4. Create and navigate to a `schema` directory: ```cli mkdir schema cd schema ``` 5. In the `schema` directory, create the `collections.fsl` file and add the following schema to it: ```fsl // Stores identity documents for staff end users. collection Staff { // Each `Staff` collection document must have a // unique `name` field value. unique [.name] // Defines the `byName()` index. // Use the index to get `Staff` documents by `name` value. index byName { terms [.name] } } // Stores medical office data. collection Office { unique [.name] index byName { terms [.name] } } // Stores appointment data. collection Appointment { // Defines the `byDateAndOffice()` index. // Use the index to get `Appointment` documents by // `date` and `office` values. index byDateAndOffice { terms [.date, .office] } // Adds a computed `date` field. compute date = (appt => { // The `date` value is derived from each `Appointment` // document's `time` field. let timeStr = appt.time.toString() let dateStr = timeStr.split("T")[0] Date(dateStr) }) } ``` 6. Create `roles.fsl` and add the following schema to it: ```fsl // User-defined role for frontdesk staff. role frontdesk { // Assigns the `frontdesk` role to tokens with // an identity document in the `Staff` collection. membership Staff // Grants `read` access to // `Appointment` collection documents. privileges Appointment { read } // Grants `read` access to // `Office` collection documents. privileges Office { read } } ``` 7. Save `collections.fsl` and `roles.fsl`. Then push the schema to Fauna: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. 
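# The --database value is the database path: the region group
# followed by the database name.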
fauna schema push \ --database us/abac ``` When prompted, accept and stage the schema. 8. Check the status of the staged schema: ```cli fauna schema status \ --database us/abac ``` 9. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/abac ``` The commit applies the staged schema to the database. This creates the collections and role. 10. Start a shell session with the `admin` role in the Fauna CLI: ```cli fauna shell \ --database us/abac \ --role admin ``` 11. Enter editor mode to run multi-line queries: ```bash > .editor ``` 12. Run the following FQL query: ```fql // Creates an `Office` collection document. // Stores a reference to the document in // the `acmeOffice` variable. let acmeOffice = Office.create({ name: "Acme Medical" }) let baysideOffice = Office.create({ name: "Bayside Hospital" }) // Creates a `Staff` collection document. Staff.create({ name: "John Doe" }) // Creates an `Appointment` collection document. Appointment.create({ patient: "Alice Appleseed", // Set appointment for now time: Time.now(), reason: "Allergy test", // Sets the previous `Office` collection document // as the `office` field value. office: acmeOffice }) Appointment.create({ patient: "Bob Brown", // Set the appointment for 30 minutes from now. time: Time.now().add(30, "minutes"), reason: "Fever", office: acmeOffice }) Appointment.create({ patient: "Carol Clark", // Set the appointment for 1 hour from now. time: Time.now().add(1, "hours"), reason: "Foot x-ray", office: baysideOffice }) Appointment.create({ patient: "Dave Dennis", // Set the appointment for tomorrow. time: Time.now().add(1, "days"), reason: "Foot x-ray", office: acmeOffice }) ``` The query creates all documents but only returns the last document. ## [](#create-an-authentication-token)Create an authentication token Create a token tied to a user’s `Staff` identity document. You typically create tokens using [credentials](../../../learn/security/tokens/#credentials) and login-related UDFs. For an example, see [Build an end-user authentication system](../auth/). For simplicity, this tutorial uses [`Token.create()`](../../../reference/fql-api/token/create/) to create the token instead. 1. Run the following query in the shell’s editor mode: ```fql // Gets `Staff` collection documents with a `name` of `John Doe`. // Because `name` values are unique, use the first document. let document = Staff.byName("John Doe").first() // Create a token using the `Staff` document // as the identity document. Token.create({ document: document }) ``` Copy the returned token’s `secret`. You’ll use the secret later in the tutorial. 2. Press Ctrl+D to exit the shell. ## [](#test-user-access)Test user access Use the token’s secret to run queries on the user’s behalf. Fauna assigns and evaluates the secret’s roles and privileges, including any predicates, at query time for every query. 1. Start a new shell session using the token secret: ```cli fauna shell \ --secret ``` 2. 
Run the following query in editor mode: ```fql // Use the `byDateAndOffice()` index to get `Appointment` documents // with a: // - `date` value of today // - `office` value containing an `Office` collection document // with a `name` of "Acme Medical" Appointment.byDateAndOffice( Date.today(), Office.byName("Acme Medical").first() ) ``` If successful, the query returns `Appointment` documents with a `date` of today: ``` { data: [ { id: "400593000902688845", coll: Appointment, ts: Time("2099-06-13T15:55:06.320Z"), date: Date("2099-06-13"), patient: "Alice Appleseed", time: Time("2099-06-13T15:55:06.239185Z"), reason: "Allergy test", office: Office("400593000896397389") }, ... ] } ``` 3. Press Ctrl+D to exit the shell. ## [](#check-environmental-attributes)Check environmental attributes Add a membership predicate to only assign the `frontdesk` role to `Staff` users during their scheduled work hours. 1. Edit `roles.fsl` as follows: ```fsl role frontdesk { membership Staff { predicate (staff => { // Allow access after the user's // scheduled start hour in UTC. Time.now().hour >= staff.schedule[0] && // Disallow access after the user's // scheduled end hour in UTC. Time.now().hour < staff.schedule[1] }) } privileges Appointment { read } privileges Office { read } } ``` 2. Save `roles.fsl` and push the schema to Fauna: ```cli fauna schema push \ --database us/abac ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/abac ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/abac ``` The commit applies the staged schema to the database. 5. [Test the user’s access](#test-user-access). The query returns a privileges-related error. The token’s identity document doesn’t have the required `schedule` field. 6. Next, add the `schedule` field to the identity document. Start a shell session with the `admin` role: ```cli fauna shell \ --database us/abac \ --role admin ``` 7. Run the following query in editor mode: ```fql // Gets the token's identity document. let user = Staff.byName("John Doe").first() // Adds a `schedule` field to the above document. user?.update({ schedule: [8, 18] // 08:00 (8 AM) to 18:00 (6 PM) UTC }) ``` 8. Exit the admin shell session and [test the user’s access](#test-user-access). If the current UTC time is within the user’s scheduled hours, the query should run successfully. If needed, you can repeat the previous step and adjust the user’s schedule. ## [](#check-identity-based-attributes)Check identity-based attributes Add a privilege predicate to only allow the `frontdesk` role to read `Appointment` documents with the same `office` as the user’s identity document. Fauna evaluates the privilege predicate at query time for every query. This lets you treat the user’s identity document as a "living" document. If the `office` value in the user’s identity document changes, their access to `Appointment` documents also changes. 1. Edit `roles.fsl` as follows: ```fsl role frontdesk { membership Staff { predicate (staff => { Time.now().hour >= staff.schedule[0] && Time.now().hour < staff.schedule[1] }) } // If the predicate is `true`, // grant `read` access to `Appointment` collection documents. privileges Appointment { read { predicate (doc => // `Query.identity()` gets the token's `Staff` identity // document. The predicate checks that `Appointment` // document's `office` value is the same as the `Staff` // user's `office` value. 
doc.office == Query.identity()?.office ) } } privileges Office { read } } ``` 2. Save `roles.fsl` and push the schema to Fauna: ```cli fauna schema push \ --database us/abac ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/abac ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/abac ``` The commit applies the staged schema to the database. 5. Test user access using the steps in [Test user access](#test-user-access). The query returns an empty Set: ``` { data: [] } ``` The token’s identity document doesn’t have the required `office` value. 6. Next, add the `office` field to the identity document. Start a shell session using your `admin` key: ```cli fauna shell \ --database us/abac ``` 7. Run the following query in editor mode: ```fql // Gets the token's identity document. let user = Staff.byName("John Doe").first() // Get the first `Office` collection document with a // `name` of "Acme Medical". let acmeOffice = Office.byName("Acme Medical").first() // Updates the identity document to add a reference to // the previous `Office` document as the `office` field value. user!.update({ office: acmeOffice }) ``` 8. Exit the admin shell session and [test the user’s access](#test-user-access). The query should run successfully. 9. Run the following query in editor mode: ```fql // Use the `byDateAndOffice()` index to get `Appointment` documents // with a: // - `date` of tomorrow // - `office` containing the "Bayside Hospital" document in the // `Office` collection Appointment.byDateAndOffice( Date.today().add(1, "days"), Office.byName("Bayside Hospital").first() ) ``` Although the sample data includes `Appointment` documents with the requested `office`, the query returns an empty Set. The `frontdesk` role can only read `Appointment` documents with the same `office` as the user. ## [](#check-data-attributes)Check data attributes Update the privilege predicate to only allow the `frontdesk` role to access `Appointment` documents with a `date` of today’s date. 1. Edit `roles.fsl` as follows: ```fsl role frontdesk { membership Staff { predicate (staff => { Time.now().hour >= staff.schedule[0] && Time.now().hour < staff.schedule[1] }) } // If the predicate is `true`, // grant `read` access to `Appointment` collection documents. privileges Appointment { read { predicate (doc => doc.office == Query.identity()?.office && // Only allow access to `Appointment` documents with // today's `date` doc.date == Date.today() ) } } privileges Office { read } } ``` 2. Save `roles.fsl` and push the schema to Fauna: ```cli fauna schema push \ --database us/abac ``` When prompted, accept and stage the schema. 3. Check the status of the staged schema: ```cli fauna schema status \ --database us/abac ``` 4. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/abac ``` The commit applies the staged schema to the database. ### [](#re-test-user-access)Re-test user access Use the token’s secret to run queries on the user’s behalf. The user should only be able to read `Appointment` documents with a `date` of today. 1. Start a shell session with the user’s token secret: ```cli fauna shell \ --secret ``` 2. 
Run the following query in editor mode: ```fql // Use the `byDateAndOffice()` index to get `Appointment` documents // with a: // - `date` value of tomorrow // - `office` value containing an `Office` collection document // with a `name` of "Acme Medical" Appointment.byDateAndOffice( Date.today().add(1, "days"), Office.byName("Acme Medical").first() ) ``` Although the sample data includes `Appointment` documents with a `date` of tomorrow, the query returns an empty Set. The user can only read `Appointment` documents with a `date` of today. 3. Run the following query in editor mode to get `Appointment` documents with a `date` of today: ```fql // Use the `byDateAndOffice()` index to get `Appointment` documents // with a: // - `date` value of today // - `office` value containing an `Office` collection document // with a `name` of "Acme Medical" Appointment.byDateAndOffice( Date.today(), Office.byName("Acme Medical").first() ) ``` The query should run successfully. ## [](#limit-access-with-udfs)Limit access with UDFs For more control, you could only allow users to access `Appointment` data using a UDF. With a UDF, you can remove direct access to `Appointment` collection documents. The UDF also lets you customize the format of returned data. In this case, we’ll exclude the `reason` field from results. First, define a `frontdeskAppts()` UDF. Then update the `frontdesk` role to: * Remove privileges for the `Appointment` collection. * Add a `call` privilege for the `frontdeskAppts()` function. * Add a privilege predicate to only allow users to call `frontdeskAppts()` with today’s date and the user’s office. 1. In the `schema` directory, create `functions.fsl` and add the following schema to it: ```fsl // Defines the `frontdeskAppts()` function. // The function gets appointments for a specific date. // Runs with the built-in `server-readonly` role's privileges. @role("server-readonly") function frontdeskAppts(date, office) { // Uses the `byDateAndOffice()` index to get `Appointment` // documents by date and office. Returned documents only // contain the `patient` and `date` fields. let appt = Appointment.byDateAndOffice(date, office) { patient, date } // Output results as an Array appt.toArray() } ``` 2. Edit `roles.fsl` as follows: ```fsl role frontdesk { membership Staff { predicate (staff => { Time.now().hour >= staff.schedule[0] && Time.now().hour < staff.schedule[1] }) } // Removed `Appointment` collection privileges // If the predicate is `true`, // grant the ability to call the `frondeskAppts` function. privileges frontdeskAppts { call { predicate ((date, office) => // Only allow users to pass today's date to the function. date == Date.today() && // Only allow users to pass their office to the function. office == Query.identity()?.office ) } } privileges Office { read } } ``` 3. Save `functions.fsl` and `roles.fsl`. Then push the schema to Fauna: ```cli fauna schema push \ --database us/abac ``` When prompted, accept and stage the schema. 4. Check the status of the staged schema: ```cli fauna schema status \ --database us/abac ``` 5. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/abac ``` The commit applies the staged schema to the database. ### [](#re-test-user-access-2)Re-test user access Use the token’s secret to run queries on the user’s behalf. The user should only be able to call `frontdeskAppts()` with today’s date. 1. Start a shell session with the user’s token secret: ```cli fauna shell \ --secret ``` 2. 
Run the following query in editor mode. The query calls `frontdeskAppts()` with today’s date and the user’s office. ```fql // Get today's date. let date = Date.today() // Get the first `Office` collection document with a // `name` of "Acme Medical". let office = Office.byName("Acme Medical").first() // Pass today's date to the // `frontdeskAppts()` function call. frontdeskAppts(date, office) ``` The returned Array only contains the `patient` and `date` fields: ``` [ { patient: "Alice Appleseed", date: Date("2099-06-13") }, { patient: "Bob Brown", date: Date("2099-06-13") } ] ``` 3. Run the following query in editor mode: ```fql // Get tomorrow's date. let date = Date.today().add(1, 'days') // Get the first `Office` collection document with a // `name` of "Acme Medical". let office = Office.byName("Acme Medical").first() // Pass tomorrow's date to the // `frontdeskAppts()` function call. frontdeskAppts(date, office) ``` The query returns a privileges-related error. The `frontdesk` role only allows you to call `frontdeskAppts()` with today’s date. 4. Run the following query in editor mode: ```fql // Get today's date. let date = Date.today() // Get the first `Office` collection document with a // `name` of "Bayside Hospital". let office = Office.byName("Bayside Hospital").first() // Pass today's date and the other office to the // `frontdeskAppts()` function call. frontdeskAppts(date, office) ``` The query returns a privileges-related error. The `frontdesk` role only allows you to call `frontdeskAppts()` with the user’s office. 5. Run the following query in editor mode: ```fql // Attempt to directly read `Appointment` collection documents. Appointment.byDateAndOffice( Date.today(), Office.byName("Acme Medical").first() ) ``` The query returns a privileges-related error. The `frontdesk` role does not grant direct access to `Appointment` collection documents. # Sample apps This section links to Fauna sample apps. Sample apps show how to use Fauna in the context of a full application. ## [](#e-commerce-sample-apps)E-commerce sample apps The following e-commerce sample apps show you how to use Fauna and a Fauna driver for a specific programming language. The app source code includes comments that highlight best practices for the driver and FQL queries. ![JavaScript](../_images/drivers/logos/javascript.svg) [JavaScript](https://github.com/fauna/js-sample-app) ![Python](../_images/drivers/logos/python.svg) [Python](https://github.com/fauna/python-sample-app) ![C#](../_images/drivers/logos/csharp.svg) [.NET/C#](https://github.com/fauna/dotnet-sample-app) ![Java](../_images/drivers/logos/java.svg) [Java](https://github.com/fauna/java-sample-app) ## [](#cdc-sample-apps)CDC sample apps The following sample apps show how you can use Fauna’s [CDC features](../../learn/cdc/). [Event streaming sample app](streaming/) A real-time chat application in JavaScript using Fauna event streams. [Event feeds sample app](event-feeds/) A Python application that uses event feeds to track changes to a database. The app uses an AWS Lambda function to send events for related changes to another service. # Event feeds sample app This sample application demonstrates how to use [event feeds](../../../learn/cdc/) to return paginated events from a collection. The application uses Fauna’s [Python driver](../../drivers/py-client/) to retrieve the events. The sample app uses an AWS Lambda function to poll for events every 10 minutes. It then sends any fetched events to another service.
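At its core, the app builds its event feed from an FQL event source. The following query is the same event source the app’s Lambda function uses later in this tutorial, shown here as a preview:

```fql
// Track events for `Product` documents with low stock.
// The Lambda function polls an event feed for this source
// and processes any events returned since its last run.
Product.where(.stock < 25).eventSource()
```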
## [](#prerequisites)Prerequisites * Knowledge of Python * Knowledge of FQL and [event feeds](../../../learn/cdc/) * Basic understanding of AWS Lambda serverless functions * AWS account * AWS SAM CLI installed ## [](#learning-objectives)Learning Objectives * How to use event feeds in a serverless application ## [](#overview-of-event-feeds)Overview of event feeds An [event source](../../../learn/cdc/) tracks changes to a specific Set of documents or document fields, defined by an FQL query, in the database. Event feeds are asynchronous requests that poll an event source for paginated events. Imagine you run an e-commerce operation. Every time a product is low in stock, you want to send a notification to your team. You can use an event feed to track the stock levels of your products. When a product’s stock level falls below a certain threshold, the feed returns an event that your application can forward to your warehouse team. Let’s say you check the stock levels every 10 minutes. The event feed returns all the events from the previous 10 minutes. ## [](#configure-fauna)Configure Fauna ### [](#create-a-fauna-database)Create a Fauna database Before you start, you need to create a Fauna database and collection. For this sample application, you can use the demo data provided by Fauna. 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/). 2. In the Dashboard, create a database with the **Demo data** option enabled. Then navigate to the database in the Dashboard. 3. In the upper left pane of the Dashboard’s Explorer page, click the demo database, and click the Keys tab. 4. Click **Create Key**. 5. Choose a **Role** of **server**. 6. Click **Save**. 7. Copy and save the **Key Secret**. You’ll use it later. ### [](#create-a-collection-to-track-event-feed-cursors)Create a collection to track event feed cursors Each time you poll an event feed, you get a top-level page cursor. You can use this cursor to get the next set of events. You can store returned cursors in a collection. 1. Create a collection named `Cursor`. ```fsl collection Cursor { name: String value: String? // Ensure each `Cursor` document // has a unique name. unique [.name] index byName { terms [.name] } } ``` 2. Create a document in the `Cursor` collection with the name `ProductInventory`. ```fql Cursor.create({name: "ProductInventory"}) ``` ## [](#create-a-serverless-project)Create a serverless project 1. Run the following command to create a new serverless project: ```bash sam init ``` 2. Choose option `1` for `AWS Quick Start Templates`, then choose the `python3.10` runtime. 3. Choose the `hello_world` template. 4. The folder structure of the project will look like this: ```bash sam-python-function/ ├── README.md ├── events │ └── event.json ├── hello_world │ ├── __init__.py │ ├── app.py # Main Lambda handler file │ └── requirements.txt # Dependencies for your Lambda function ├── template.yaml # SAM template file └── tests └── unit └── test_handler.py ``` ## [](#write-code-for-the-serverless-function)Write code for the serverless function 1. Navigate to the project directory and open the `hello_world/requirements.txt` file. 2. Add the Fauna driver as a dependency: ```bash fauna ``` 3. Navigate to the Lambda function directory and run the following command to install the dependencies: ```bash cd hello_world pip install -r requirements.txt -t . ``` 4.
Next, add the following code to the `hello_world/app.py` file: ```python import json import time from fauna import fql from datetime import datetime, timedelta from fauna.client import Client, FeedOptions def lambda_handler(e, context): client = Client() # Get the previous feed cursor if it exists cursor = None options = None cursor_data = client.query(fql('Cursor.byName("ProductInventory").first()')) cursor = cursor_data.data.value if cursor_data.data.value else None # If no cursor exists, capture all events from previous 10 minutes if cursor is None: # Calculate timestamp for 10 minutes ago ten_minutes_ago = datetime.now() - timedelta(minutes=10) # Convert to microseconds start_ts = int(ten_minutes_ago.timestamp() * 1_000_000) options = FeedOptions( start_ts=start_ts ) feed = client.feed(fql('Product.where(.stock < 25).eventSource()'), options) for page in feed: for event in page: event_type = event['type'] if event_type == 'add': # Do something on add print('Add event: ', event) elif event_type == 'update': # Make an API call to another service on event # (i.e. email or slack notification) print('Update event: ', event) elif event_type == 'remove': # Do something on remove print('Remove event: ', event) # Store the cursor of the last page cursor = page.cursor # Store the cursor in the database cursor_update = client.query(fql(''' Cursor.byName("ProductInventory").first()!.update({ value: ${cursor} }) ''', cursor=cursor)) print(f'Cursor updated: {cursor}') return { "statusCode": 200, "body": json.dumps({ "message": "Event feed processed successfully" }) } ``` 5. Save the file. 6. Open the `template.yaml` file. This file defines the AWS resources that your serverless application will use. 7. Add the following code to the `template.yaml` file: ```yaml AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: > sam-python-function Globals: Function: Timeout: 3 Resources: FeedFunction: Type: AWS::Serverless::Function Properties: CodeUri: hello_world/ Handler: app.lambda_handler Runtime: python3.10 Events: # Schedule to run every 10 minutes FeedFunctionSchedule: Type: Schedule Properties: Schedule: rate(10 minutes) # AWS CloudWatch Events schedule Name: "FeedEvery10Minutes" # Optional: Name of the rule Description: "Trigger hello_world Lambda every 10 minutes" ``` 8. Save the file. 9. Change directory to the root of your project and run the following command to deploy the serverless function: ```bash sam deploy --guided ``` 10. Follow the prompts to deploy the function. 11. Once the function is deployed, you can view the logs in the AWS CloudWatch console. ## [](#add-fauna-key-to-the-lambda-function-environment)Add Fauna key to the Lambda function environment Add the Fauna key you generated earlier to the Lambda function environment using the AWS CLI. 1. Run the following command to add the FAUNA\_SECRET environment variable to the Lambda function: ```bash aws lambda update-function-configuration \ --function-name \ --environment Variables="{FAUNA_SECRET=}" ``` ## [](#testing)Testing The Lambda function will now automatically trigger every 10 minutes. The function will poll for events that happened in the past 10 minutes. You can verify this in the CloudWatch Logs. ### [](#verify-the-event-feed)Verify the event feed Run the following FQL query to update the stock of a product to less than 25: ```fql Product.all().first()!.update({ stock: 10 }) ``` After running the query, you should see the event in the CloudWatch logs the next time the Lambda function runs. 
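You can also confirm that the function saved an event feed cursor for its next run. In the Fauna shell, read the `Cursor` document the function updates after processing a feed page:

```fql
// Gets the cursor document the Lambda function
// updates after it processes a feed page.
Cursor.byName("ProductInventory").first()
```

After a successful run, the document should contain the cursor value from the last processed page.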
![Cloudwatch logs showing the event feed](../../_images/sample-apps/event-feed-logs.png) ## [](#cleanup)Cleanup To avoid incurring charges, delete the resources you created in this tutorial. 1. Run the following command to delete the resources: ```bash sam delete ``` # Event streams sample app This reference application shows how you can use Fauna event streams to build real-time apps. You can use it as a starting point for your own app. The complete source code can be found on [GitHub](https://github.com/fauna-labs/chat-app-streaming). ## [](#prerequisites)Prerequisites * Knowledge of Javascript, React, and Next.js * A Fauna account * Knowledge of FQL ## [](#learning-objectives)Learning Objectives * How to use event streams ## [](#overview-of-event-streams)Overview of event streams Fauna’s event stream feature can manage real-time data processing in your applications. You can create and subscribe to event sources for collections, indexes, or a single document. To create an event source, write a query defining the documents to watch and convert it to an event source using [`set.eventSource()`](../../../reference/fql-api/set/eventsource/) or [`set.eventsOn()`](../../../reference/fql-api/set/eventson/). The following example creates an event source on a collection: ```fql Product.all().eventSource() ``` In this example, you subscribe to a collection called `Product`. When a document is added, updated, or deleted in the `Product` collection, the event source emits an event. The following is another example of creating an event source using the `.eventsOn()` method: ```fql Product.all().eventsOn(.price) ``` `.eventsOn()` only creates an event when changes are made to the defined fields. In this example, it watches for changes to the `price` field in the `Product` collection. You can also subscribe to a single document: ```fql let product = Product.where(.name == "cups").first() Set.single(product).eventSource() ``` ## [](#set-up-the-sample-application)Set up the sample application 1. Clone the repository from GitHub: ```bash git clone https://github.com/fauna-labs/chat-app-streaming ``` 2. Install the dependencies: ```bash cd chat-app-streaming npm install ``` 3. Next, configure the Fauna database. If you haven’t already, log in to Fauna using the Fauna CLI: ```cli fauna login ``` 4. Use the Fauna CLI to create a new database: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. fauna database create \ --name chat_app \ --database us ``` 5. Push the schema to Fauna: ```cli fauna schema push \ --database us/chat_app \ --dir ./schema ``` When prompted, accept and stage the schema. 6. Check the status of the staged schema: ```cli fauna schema status \ --database us/chat_app ``` 7. When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/chat_app ``` When prompted, agree and commit the changes. The commit applies the staged schema to the database. 8. Open a shell session to run queries on your `chat_app` database: ```cli fauna shell \ --database us/chat_app ``` 9. Create a new key for the `UnAuthenticatedRole`: ```fql Key.create({ role: 'UnAuthenticatedRole' }) ``` ``` { id: "395112986085163081", coll: Key, ts: Time("2099-04-14T04:12:36.920Z"), secret: "fn...", role: "UnAuthenticatedRole" } ``` 10. Press Ctrl+D to exit the shell. 11.
Create a `.env` file in the root directory of your project and save the generated key and the Fauna endpoint: ```bash NEXT_PUBLIC_FAUNA_UNAUTHENTICATED_SECRET= NEXT_PUBLIC_FAUNA_ENDPOINT=https://db.fauna.com ``` 12. Run the application: ```bash npm run dev ``` 13. Open the app in your browser at [http://localhost:3000](http://localhost:3000) ## [](#review-the-application-code)Review the application code ### [](#viewing-rooms-in-real-time)Viewing rooms in real time The application is built using Next.js. In the home page [/src/app/page.js](https://github.com/fauna-labs/chat-app-streaming/blob/main/src/app/page.js), users can create chat rooms and join existing conversations. The code uses event streams to subscribe to real-time changes to the `Room` collection: ```javascript // ... Rest of the component's code const response = await client.query(fql`Room.all().eventSource()`); const eventSource = response.data; if (!streamRef.current) { streamRef.current = await client.stream(eventSource); for await (const event of streamRef.current) { switch (event.type) { case "start": console.log("Stream start", event); break; case "update": case "add": setExistingRooms(prevRooms => { const existingRoom = prevRooms.find(room => room.id === event?.data.id); return existingRoom ? prevRooms : [...prevRooms, event?.data]; }); break; case "remove": console.log("Stream remove:", event); // Handles removal of rooms break; case "error": console.log("Stream error:", event); // Handles errors break; } } // Close the stream when the component unmounts return () => { streamRef.current.close(); }; } ``` ### [](#real-time-messaging)Real-Time Messaging The core of a real-time chat application is its messaging functionality. You can find the code for chat functionality in the [file](https://github.com/fauna-labs/chat-app-streaming/blob/main/src/app/room/%5B…​id%5D/page.js): ```javascript const startMessageStream = async () => { try { const response = await streamClient.query(fql` let roomRef = Room.byId(${id}) Message.byRoomRef(roomRef).eventSource() `); const eventSource = response.data; if (!messageStream.current) { messageStream.current = await streamClient.stream(eventSource) .on("start", event => { console.log("Stream start", event); getExistingMessages(); }) .on("add", event => { console.log("Stream add", event); setMessages(prevMessages => { const existingMessage = prevMessages.find(msg => msg.id === event?.data.id); return existingMessage ? prevMessages : [...prevMessages, event?.data]; }); }) .on('error', event => { console.log("Stream error:", event); }); messageStream.current.start(); } } catch (error) { console.error("Error fetching data:", error); } }; ``` The code snippet shows the core functionality of setting up a stream to monitor changes in the `Message` collection for a room with the given `id`. ### [](#user-authentication)User authentication The sample app uses Fauna ABAC for user authentication. If you cloned the application from GitHub and ran the setup steps, you already have authentication set up. To learn more about Fauna authentication, follow the [Build an end-user authentication system](../../tutorials/auth/) guide. 
You can find the app’s authentication logic in [src/app/authenticationform/page.js](https://github.com/fauna-labs/chat-app-streaming/blob/main/src/app/authenticationform/page.js) file: ```javascript const handleAuth = async (e) => { e.preventDefault(); if (isLoginView) { try { const result = await client.query(fql` Login(${username}, ${password}) `) const userInfo = { username: result?.data.user.username, id: result?.data.user.id, key: result?.data.cred.secret, }; setCookieWithTTL("chat-loggedin", JSON.stringify(userInfo), 1440 * 60 * 1000); router.push('/'); } catch (error) { console.log(error) } } else { try { const result = await client.query(fql` let user = User.byUsername(${username}).isEmpty() if(user == true) { Signup(${username}, ${password}) } else { let message = "This username is already taken, select another" message } `) if (result.data == "This username is already taken, select another") { alert("This username is already taken, select another"); setUsername('') } else { setIsLoginView(true); setUsername(''); setPassword(''); alert('Account created, please login now') } } catch (error) { console.log(error) } } }; ``` # Workshops AWS Workshop ![AWS Fauna Workshop](../_images/workshops/aws-workshop.png) Build a serverless REST API using Fauna and AWS Serverless Services, including Amazon API Gateway AWS Lambda, and Amazon CloudWatch. [Start Workshop](aws/) Cloudflare Workshop ![Cloudflare Workshop](../_images/workshops/cloudflare.png) Build a serverless distributed application with Cloudflare Workers and Fauna. [Start Workshop](cloudflare/) Fastly Compute@Edge ![Fastly Compute@Edge Workshop](../_images/workshops/fastly.png) The Fastly globally distributed edge computing network paired with the geographically distributed, low-latency, and strongly consistent Fauna database, gives you the power to build REST APIs that offer speed, reliability, and scale with low operational overhead. [Start Workshop](https://sumptuous-scarecrow-263.notion.site/Ship-faster-simpler-and-more-secure-serverless-REST-APIs-with-Fastly-s-Compute-Edge-and-Fauna-f4484e72ea0844e6ba233d4caa8b8f0a/) # Workshop: Build a serverless application with AWS Lambda and Fauna In this workshop we’ll explore Fauna: what it is, how to use it, and the problems it can solve. We’ll demonstrate these through a sample app, covering Fauna’s key features and benefits. ## [](#what-is-fauna)What is Fauna? Fauna is a truly serverless document database for modern apps. It combines the flexibility of document storage with relational features, like strong consistency and joins. Fauna is truly serverless, eliminating the need for provisioning, scaling, or managing infrastructure. ### [](#why-use-fauna)Why use Fauna? Fauna is ideal for serverless environments like AWS Lambda. Fauna Query Language (FQL) is powerful and easy to learn, enabling developers to write clear queries and handle complex transactions. ### [](#core-technology-distributed-transaction-engine-dte)Core Technology: Distributed Transaction Engine (DTE) In Fauna, every query is a transaction, ensuring ACID compliance across all operations, even in globally distributed region groups. It synchronizes read/write operations across multiple regions, making Fauna a reliable choice for distributed systems and compliance needs. ### [](#security-and-multi-tenancy)Security and multi-tenancy Fauna supports both role-based access control (RBAC) and attribute-based access control (ABAC), allowing for fine-tuned permissions based on real-time context. 
Fauna also integrates with third-party identity providers, such as AWS Cognito and Auth0. Fauna’s hierarchical database model makes it easy to create isolated databases for multi-tenant applications. Each Fauna database can have many child databases. You can use child databases as tenants for your application. All databases, including child databases, are instantly allocated without provisioning or warmup, and each database is logically isolated from its peers with separate access controls. You can learn more about Fauna’s features and benefits in the [Why Fauna](https://fauna.com/blog/use-cases-unlocked) article. ## [](#prerequisites)Prerequisites * [Node.js](https://nodejs.org/en/download/package-manager) v20.x or later * An AWS account * [A Fauna account](https://dashboard.fauna.com/accounts/register) * [AWS CDK](https://docs.aws.amazon.com/cdk/v2/guide/getting_started.html) * [Fauna CLI](https://docs.faunadb.org/fauna/current/build/cli/v4/) * Some familiarity with AWS Lambda functions and the AWS CDK ## [](#learning-objectives)Learning Objectives * Basics of Fauna * How to create a serverless application with AWS Lambda and Fauna * How to use the Fauna JavaScript driver in a serverless application * How to deploy a serverless application with the AWS CDK ## [](#fauna-basics)Fauna basics 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/). ### [](#create-a-new-database-using-the-fauna-dashboard)Create a new database using the Fauna Dashboard 1. In the dashboard, select the **Create Database** option. ![Create a new database](../../_images/workshops/aws/create-db.png) 2. Enter a name for the database. ![Name your database](../../_images/workshops/aws/create-db-2.png) 3. Select a **region group** for the database and click **Create**. Region groups give you control over where your data resides. Region groups make it possible to comply with data locality legislation. Fauna ensures that data is strongly consistent across all replicas in a region group. Region groups solve most of the challenges related to building and scaling a database. For example, you don’t need to handle sharding, replication, data resiliency, or failover management. ### [](#create-a-collection)Create a collection In Fauna, you store data as JSON-like documents in collections. Each collection can have its own configuration, indexes, and schema. 1. Select the database you created. ![database list](../../_images/workshops/aws/databases.png) 2. Select the **Shell** tab and run the following Fauna Query Language (FQL) query to create a new collection: ```fql Collection.create({ name: "Category" }) ``` ``` { name: "Category", coll: Collection, ts: Time("2024-11-03T19:06:12.810Z"), history_days: 0, indexes: {}, constraints: [] } ``` The query above creates the `Category` collection. We’ll explore FQL in more detail in the [FQL basics](#fauna-query-language-fql-basics) section of the workshop. ### [](#run-a-query)Run a query Now that you have a collection, run a query to list all collections in the database: ```fql Collection.all() ``` ``` { data: [ { name: "Category", coll: Collection, ts: Time("2024-11-03T19:06:12.810Z"), history_days: 0, indexes: {}, constraints: [] } ] } ``` ### [](#fauna-schema)Fauna schema You can explore your database schema from the dashboard. Fauna supports schema for collections, functions, roles, and more. 1. Select the `Category` collection under your database. 2. Click the **Schema** tab to view the collection’s schema. 
![Schema](../../_images/workshops/aws/schema.png) The schema is written in [FSL (Fauna Schema Language)](../../../reference/fsl/). You can use FSL to define your database schema and enforce constraints. We will explore FSL in more detail later in the workshop. ### [](#fauna-cli)Fauna CLI The Fauna CLI is a command-line tool that lets you interact with Fauna from your terminal. You can use it to manage databases, schema, and more. Connect to your database using the Fauna CLI: 1. If you haven’t already, log in to Fauna using the CLI: ```cli fauna login ``` 2. Run the following command in your terminal: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. # Replace 'mynewdb' with your database's name. fauna shell \ --database us/mynewdb ``` 3. You can now run FQL queries directly from your terminal. Run the following command to list all the collections in the database: ```fql Collection.all() ``` 4. Run the following command to create a new document in the `Category` collection: ```fql Category.create({name: "Electronics", description: "Electronic gadgets"}) ``` ## [](#set-up-the-project)Set up the project 1. A sample project is provided to help you get started. Clone the repository from GitHub: ```bash git clone https://github.com/fauna-labs/faunaproject cd faunaproject ``` 2. Install the dependencies: ```bash npm install ``` ### [](#configure-fauna)Configure Fauna 1. Use the Fauna CLI to create a new database: ```cli # Replace 'us' with your preferred region group: # 'us' (United States), 'eu' (Europe), or `global`. fauna database create \ --name aws_demo \ --database us ``` 2. Create a new key for the database. You can use the `server` role for this key: ```cli fauna query "Key.create({ role: 'server' })" \ --database us/aws_demo ``` 3. Copy the generated key’s secret. You’ll need it to connect to the database. ### [](#connect-to-fauna)Connect to Fauna You can connect to Fauna using the [Fauna JavaScript driver](https://www.npmjs.com/package/fauna). The following code snippet demonstrates how to connect to Fauna using the driver and run a query: ```javascript import { Client, fql, FaunaError } from "fauna"; const client = new Client({ secret: "<YOUR_KEY_SECRET>", }); try { // Build a query using the `fql` method const collectionQuery = fql`Collection.all()`; // Run the query const response = await client.query(collectionQuery); console.log(response); } catch (error) { if (error instanceof FaunaError) { console.log(error); } } finally { // Clean up any remaining resources client.close(); } ``` Replace `<YOUR_KEY_SECRET>` with the key secret you generated earlier. In the sample code, you are provided with a function in the `lambda/fauna-client.ts` file that connects and runs a query. ### [](#fauna-schema-files)Fauna schema files In the starter code, you’ll find a `schema` folder. This folder contains the FSL files that define the database schema. These schema files are used to create collections and functions in Fauna. You can think of them as infrastructure as code for your database. Let’s explore the `schema` folder: ```bash schema/ ├── Category.fsl ├── Customer.fsl ├── Order.fsl ├── OrderItem.fsl ├── Product.fsl └── functions.fsl ``` The schema folder contains `.fsl` files. Each file defines a collection or functions. For example, the `Category.fsl` file defines the `Category` collection.
Let’s explore the `Category.fsl` file: ```fsl collection Category { name: String description: String compute products: Set = (category => Product.byCategory(category)) unique [.name] index byName { terms [.name] } } ``` In the following sections, we’ll break down the schema definitions. #### [](#field-definitions)Field definitions Field definitions are like columns in a table. In the `Category` collection, we have two fields: `name` and `description`. Both are of type `String`. #### [](#schemaless-document-type)Schemaless document type Collections with a schemaless or permissive document type allow ad hoc fields in collection documents. This gives you the flexibility to store any type of data in the collection. You can use this flexibility to rapidly iterate on your data model as your application evolves. The `*: Any` wildcard definition allows any field in `Category` documents. This is useful when you want to store data that doesn’t fit a strict schema. #### [](#unique-constraint)Unique constraint Unique constraints ensure a field value or a combination of field values is unique for each document in a collection. Fauna rejects write operations that don’t meet the constraint. The `name` field is unique in the `Category` collection. This means that no two documents in the `Category` collection can have the same name. You can find another example of a unique constraint in the `Product` collection schema: ```fsl collection Products { // ... rest of the schema // In this example, the `name`, `description`, and `price` // fields must be unique for each // document in the `Product` collection. unique [.name, .description, .price] } ``` #### [](#check-constraints)Check constraints Check constraints ensure document field values meet a pre-defined rule. For example, only allow writes to the `Product` collection if the value of the product’s `stock` field is greater than zero. Inside the parenthesis of the constraint is an FQL predicate that evaluates to `true`, `false`, or `null` (equivalent to `false`). The predicate can query collections, use functions, etc. ```fsl // Product.fsl collection Product { // .... rest of the schema check stockIsValid (product => product.stock >= 0) } ``` #### [](#computed-fields)Computed fields Computed fields are derived field values. They let you create new fields based on existing data and calculations that are computed when the document is read. The `compute` keyword is used to define computed fields. In the `Category` collection, the `products` field is a computed field. It returns a set of products that belong to the category. ```fsl collection Category { // ... rest of the schema compute products: Set = (category => Product.byCategory(category)) } ``` We will discuss computed fields in more detail in the data relationships section. #### [](#indexes)Indexes In Fauna, you use indexes for quick and efficient queries. Indexes are used to query documents based on specific fields. In the `Category` collection, we have an index named `byName()` that indexes the `name` field. ```fsl collection Category { // ... rest of the schema index byName { terms [.name] } } ``` ### [](#configure-fauna-project)Configure Fauna Project 1. Navigate to the `schema` directory: ```bash cd schema ``` 2. Publish the schema to the database: ```cli fauna schema push \ --database us/aws_demo ``` 3. When prompted, accept and stage the schema. 4. Check the status of the staged schema: ```cli fauna schema status \ --database us/aws_demo ``` 5. 
When the status is `ready`, commit the staged schema to the database: ```cli fauna schema commit \ --database us/aws_demo ``` The above command will push the schema files to the database and create the collections and functions defined in the schema. ## [](#fauna-query-language-fql-basics)Fauna Query Language (FQL) basics FQL is Fauna’s native query language. In this section, we’ll briefly cover the basics of FQL. Connect to your Fauna database using the Fauna shell and run the following queries: ### [](#create-documents)Create Documents Create a new document in the `Product` collection: ```fql Product.create({ name: "Laptop", description: "A high-performance laptop", price: 1000, stock: 100, category: Category.byName("Electronics").first() }) ``` ### [](#read-documents)Read Documents 1. Open the Fauna shell in your terminal. ```cli fauna shell \ --database us/aws_demo ``` 2. Type **.editor** and press **Enter** to open the editor mode in the shell. 3. Run the following query to get all the documents in the `Category` collection: ```fql Category.all() { name, description } ``` 4. Note that you can define the fields you want to return in the query. In this case, we are returning the `name` and `description` fields for each document in the `Category` collection. 5. You can also retrieve a single document by its id and set it to a variable: ```fql let cat = Category.byId("CATEGORY_DOCUMENT_ID") cat ``` ### [](#query-documents-by-index)Query Documents by Index For efficient querying, you can use indexes in Fauna. The following query retrieves documents from the `Category` collection using the `byName` index: ```fql Category.byName("Electronics") ``` Learn more about [Index](../../../learn/data-model/indexes/). ### [](#update-documents)Update Documents Update the `stock` field of a document in the `Product` collection: ```fql let product = Product.byName("Laptop").first() product!.update({stock: 50}) ``` ### [](#delete-documents)Delete Documents Delete a document in the `Product` collection: ```fql let product = Product.byName("Laptop").first() product!.delete() ``` ## [](#data-relationships)Data Relationships Fauna gives you the flexibility of a document database with the power of relational databases. You can define relationships such as one-to-one, one-to-many, and many-to-many between documents. In the sample project, we have the following relationships between collections: * A `Category` can have many `Products`. * A `Product` belongs to one `Category`. * A `Customer` can have many `Orders`. * An `Order` can have many `OrderItems`. These relationships are defined in the schema files. Let’s explore the `Product.fsl` file: ```fsl collection Product { // ... rest of the schema code category: Ref stock: Int // ... rest of the schema code } ``` In the `Product` collection, the `category` field is a reference to the `Category` collection. This establishes a one-to-many relationship between `Category` and `Product`. The following code snippet demonstrates how to create a new product and associate it with a category: ```fql Product.create({ name: "Laptop", description: "A high-performance laptop", price: 1000, stock: 100, category: Category.byName("Electronics").first() }) ``` In the code above, we create a new product named `Laptop` and associate it with the `Electronics` category. 
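Because the `category` field stores a document reference, you can project it when reading a product to resolve the related `Category` document, similar to a SQL join. A minimal example, assuming the `Laptop` product created above exists:

```fql
// Read the product and resolve its `category` reference.
Product.byName("Laptop").first() {
  name,
  price,
  category {
    name,
    description
  }
}
```

Projecting the reference returns the category’s fields inline instead of a bare document reference.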
To query all products in a category run the following query: ```fql Product.byCategory(Category.byName("Electronics").first()) ``` In the sample application code, you will find well-commented code for various relationships between collections. Learn more about [Data Relationships](../../../learn/data-model/relationships/). ### [](#balancing-between-normalization-and-denormalization)Balancing between normalization and denormalization Balancing normalization and denormalization in Fauna depends on understanding your application’s access patterns, update frequency, and performance requirements. Below is a quick guideline to help make the right choice: **Normalize if:** * Data is updated frequently and needs consistency. * You are working with large or complex datasets. * Relationships are dynamic or require transactional guarantees. **Denormalize if:** * The application is read-heavy, and data is accessed together frequently. * The embedded data is relatively static and fits within Fauna’s document size limits. ## [](#explore-the-sample-app)Explore the sample app The sample app provided uses AWS Lambda functions as REST APIs to interact with the Fauna database. The app uses Fauna JavaScript driver to connect to the database and perform CRUD operations. Under the `lambda` folder, you’ll find well-documented code for each Lambda function. The main application logic is in these files. These lambda functions are triggered by API Gateway endpoints. ### [](#application-logic-in-aws-lambda-functions)Application logic in AWS Lambda functions Let’s explore the code in `lambda/getProducts.ts` file. This Lambda function is executed when the `/products` endpoint is called. ```javascript import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda'; import { faunaClient } from './fauna-client'; // Adjust this import based on your project's structure import { fql, Page } from 'fauna'; import { Product } from './models/products.model'; // Adjust this import based on your project's structure export const handler = async (event: APIGatewayProxyEvent): Promise => { const queryParams = event.queryStringParameters || {}; const { category, nextToken = undefined, pageSize = '10' } = queryParams; const pageSizeNumber = Number(pageSize); try { const queryFragment = category === undefined ? fql`Product.sortedByCategory().pageSize(${pageSizeNumber})` : fql`Product.byCategory(Category.byName(${category}).first()).pageSize(${pageSizeNumber})`; const query = fql` ${queryFragment} .map(product => { let product: Any = product; let category: Any = product.category; { id: product.id, name: product.name, price: product.price, description: product.description, stock: product.stock, category: { id: category.id, name: category.name, description: category.description }, } }) `; const { data: page } = await faunaClient.query>( nextToken ? fql`Set.paginate(${nextToken})` : query ); return { statusCode: 200, body: JSON.stringify({ results: page.data, nextToken: page.after }), }; } catch (error: any) { console.error('Error fetching products:', error); return { statusCode: 500, body: JSON.stringify({ message: 'Internal server error' }), }; } }; ``` This code defines an AWS Lambda handler function that fetches a paginated list of products from a Fauna database. It checks for optional query parameters: * **Category**: If provided, only products in that category are fetched; otherwise, all products are retrieved. * **`nextToken`**: Used to get the next page of results if there are more than the page size. 
* **`pageSize`**: Sets the number of products per page (default is 10). The `queryFragment` builds a Fauna query based on whether a category is specified. The final query maps each product to include detailed information (id, name, price, description, stock, and category info). The query result returns a JSON response with the products and an optional `nextToken` for pagination. If an error occurs, it logs the error and returns a 500 status with an error message. #### [](#pagination)Pagination The Lambda function uses Fauna’s `Set.paginate()` function to handle pagination. The `nextToken` query parameter is used to fetch the next page of results. The `pageSize` parameter sets the number of products per page. [`Set.paginate()`](../../../reference/fql-api/set/static-paginate/) provides more information on pagination. You can explore the code for the other Lambda functions in the `lambda` directory. The functions handle creating, updating, and deleting products, fetching products by price, and managing customer carts. The code is well-documented and easy to follow. ## [](#infrastructure-as-code-with-aws-cdk)Infrastructure as code with AWS CDK The sample project uses AWS CDK to set up the infrastructure for the Lambda functions and API Gateway. The `lib/faunaproject-stack.ts` file contains the CDK stack definition. ```typescript import * as cdk from 'aws-cdk-lib'; import { Construct } from 'constructs'; import * as lambda from 'aws-cdk-lib/aws-lambda'; import * as apigateway from 'aws-cdk-lib/aws-apigateway'; import * as dotenv from 'dotenv'; dotenv.config(); export class NewprojectStack extends cdk.Stack { constructor(scope: Construct, id: string, props?: cdk.StackProps) { super(scope, id, props); // Define the getProducts Lambda function const getProductsLambda = new lambda.Function(this, 'GetProductsFunction', { functionName: 'GetProducts', runtime: lambda.Runtime.NODEJS_20_X, code: lambda.Code.fromAsset('lambda'), handler: 'getProducts.handler', environment: { FAUNA_SECRET: process.env.FAUNA_SECRET || '', }, }); // Define the createProduct Lambda function const createProductLambda = new lambda.Function(this, 'CreateProductFunction', { functionName: 'CreateProduct', runtime: lambda.Runtime.NODEJS_20_X, code: lambda.Code.fromAsset('lambda'), handler: 'createProduct.handler', environment: { FAUNA_SECRET: process.env.FAUNA_SECRET || '', // Add necessary environment variables }, }); // Define the updateProduct Lambda function const updateProductLambda = new lambda.Function(this, 'UpdateProductFunction', { functionName: 'UpdateProduct', runtime: lambda.Runtime.NODEJS_20_X, code: lambda.Code.fromAsset('lambda'), handler: 'updateProduct.handler', environment: { FAUNA_SECRET: process.env.FAUNA_SECRET || '', // Add necessary environment variables }, }); // Define the getProductsByPrice Lambda function const getProductsByPriceLambda = new lambda.Function(this, 'GetProductsByPriceFunction', { functionName: 'GetProductsByPrice', runtime: lambda.Runtime.NODEJS_20_X, code: lambda.Code.fromAsset('lambda'), handler: 'getProductsByPrice.handler', environment: { FAUNA_SECRET: process.env.FAUNA_SECRET || '', // Add necessary environment variables }, }); // Define the getCustomerCart Lambda function const getCustomerCartLambda = new lambda.Function(this, 'GetCustomerCartFunction', { functionName: 'GetCustomerCart', runtime: lambda.Runtime.NODEJS_20_X, code: lambda.Code.fromAsset('lambda'), handler: 'getCustomerCart.handler', environment: { FAUNA_SECRET: process.env.FAUNA_SECRET || '',// Add necessary 
environment variables }, }); // Define the createOrUpdateCart Lambda function const createOrUpdateCartLambda = new lambda.Function(this, 'CreateOrUpdateCartFunction', { functionName: 'CreateOrUpdateCart', runtime: lambda.Runtime.NODEJS_20_X, code: lambda.Code.fromAsset('lambda'), handler: 'createOrUpdateCart.handler', environment: { FAUNA_SECRET: process.env.FAUNA_SECRET || '',// Add necessary environment variables }, }); // Create API Gateway const api = new apigateway.RestApi(this, 'CrudApi', { restApiName: 'Fauna Workshop Service', description: 'This service handles CRUD operations.', }); // GET /products - Get all products const products = api.root.addResource('products'); const getProductsIntegration = new apigateway.LambdaIntegration(getProductsLambda); products.addMethod('GET', getProductsIntegration); // POST /products - Create a new product const createProductIntegration = new apigateway.LambdaIntegration(createProductLambda); products.addMethod('POST', createProductIntegration); // PATCH /products/{id} route for PATCH const product = products.addResource('{id}'); const updateProductIntegration = new apigateway.LambdaIntegration(updateProductLambda); product.addMethod('PATCH', updateProductIntegration); // GET /products/by-price - Get products by price range const byPrice = products.addResource('by-price'); const getProductsByPriceIntegration = new apigateway.LambdaIntegration(getProductsByPriceLambda); byPrice.addMethod('GET', getProductsByPriceIntegration); // GET /customers/{id}/cart - Get customer's cart const customers = api.root.addResource('customers'); const customer = customers.addResource('{id}'); const cart = customer.addResource('cart'); const getCustomerCartIntegration = new apigateway.LambdaIntegration(getCustomerCartLambda); cart.addMethod('GET', getCustomerCartIntegration); // /customers/{id}/cart route for POST const createOrUpdateCartIntegration = new apigateway.LambdaIntegration(createOrUpdateCartLambda); cart.addMethod('POST', createOrUpdateCartIntegration); /** END */ } } ``` This code defines an AWS CDK stack that sets up an API with Lambda functions to handle CRUD operations for products and customer carts in a Fauna database. Six Lambda functions are created for specific operations: * `getProductsLambda`: Fetches all products. * `createProductLambda`: Creates a new product. * `updateProductLambda`: Updates an existing product by ID. * `getProductsByPriceLambda`: Gets products within a specified price range. * `getCustomerCartLambda`: Retrieves a customer’s cart by ID. * `createOrUpdateCartLambda`: Adds or updates a customer’s cart. The code also creates an API Gateway to provide HTTP endpoints for the Lambda functions * `GET /products`: Retrieves all products. * `POST /products`: Creates a new product. * `PATCH /products/{id}`: Updates a product by ID. * `GET /products/by-price`: Fetches products by price. * `GET /customers/{id}/cart`: Gets a customer’s cart by ID. * `POST /customers/{id}/cart`: Creates or updates a customer’s cart. It loads environment variables (e.g., `FAUNA_SECRET`) from the Lambda functions' environment variables. Use AWS Secrets Manager or Parameter Store to securely store and manage secrets for production applications. ## [](#deploy-the-project)Deploy the project The project uses AWS CDK to deploy the Lambda functions and API Gateway. 
You can deploy the project using the following command: ```bash cdk deploy ``` To delete the AWS resources created by the project, run: ```bash cdk destroy ``` # Workshop: Build serverless edge applications with Cloudflare Workers and Fauna In this workshop, you’ll learn how to build a distributed serverless application using [Cloudflare Workers](https://developers.cloudflare.com/workers/) and Fauna. The example app uses Fauna and [Cloudflare Workers Cron Triggers](https://developers.cloudflare.com/workers/configuration/cron-triggers/) to: * Fire a Workers function that connects to Fauna. The function confirms the Worker has access to a cursor value. The cursor records the last time the function was run. * Connect to a Fauna [event feed](../../../learn/cdc/) of e-commerce orders, getting events since the last cursor. * Use a webhook to push order information from the event feed to a fulfillment application. * Write the resulting cursor value to Fauna for use in the next Workers function run. ## [](#why-cloudflare-workers-and-fauna)Why Cloudflare Workers and Fauna? Cloudflare Workers are serverless functions that run on Cloudflare’s edge network. They are written in [JavaScript, TypeScript, Rust, or Python](https://developers.cloudflare.com/workers/languages/) and can be used to build serverless applications that run close to your users, reducing latency and improving performance. Fauna is a globally distributed, low-latency, strongly consistent, serverless database delivered as an API. It is designed to work well with serverless functions like Cloudflare Workers and provides a powerful and flexible data platform for building modern applications. Because Fauna is globally distributed, your data is always close to your users. By combining Cloudflare Workers and Fauna, you can build serverless applications that are fast, reliable, and scalable, with low operational overhead. ## [](#prerequisites)Prerequisites * A Cloudflare account * A [Fauna account](https://dashboard.fauna.com/) * [Node.js](https://nodejs.org/en/download/package-manager) v20.x or later installed on your local machine * [Fauna CLI v4](../../cli/v4/) installed on your machine with an access key * Some familiarity with Cloudflare Workers and Fauna ## [](#creating-the-cloudflare-worker)Creating the Cloudflare Worker 1. Install Cloudflare Wrangler: ```bash npm install -g wrangler@latest ``` Ensure Cloudflare Wrangler is v3.88 or higher. 2. Create a new Cloudflare Worker project: ```bash npm create cloudflare -- my-fauna-worker cd my-fauna-worker ``` When running `npm create cloudflare ...`, you’re prompted with multiple questions. When asked which example and template, choose the "Hello World" one. For language, choose "TypeScript". When it asks if you want to deploy your application, select "No". 3. Set up [cron triggers](https://developers.cloudflare.com/workers/configuration/cron-triggers/) in `wrangler.toml`: ```cli cat <<EOF >> wrangler.toml [triggers] crons = [ "*/30 * * * *" ] EOF ``` 4. Using the Wrangler CLI, deploy the Worker to register it in Cloudflare: ```bash wrangler deploy ``` 5. Open the newly created project in your favorite code editor. ## [](#create-a-fauna-database)Create a Fauna Database You can create a new database from the Fauna Dashboard or using the [Fauna CLI](../../cli/v4/). For this workshop, we will create a new database using the [Fauna Dashboard](https://dashboard.fauna.com). 1. Log into the Fauna Dashboard. 2.
Choose the **Region group** you want the database to be created in, and click the **+** button for that region. ![Create a Fauna Database](../../_images/workshops/create_database_dashboard.png) 3. Name the database **mydb**, enable the **Use demo data** option, enable/disable **Backups**, and click **Create**. ![Configure the Fauna Database](../../_images/workshops/database_options.png) ## [](#modify-the-database-schema-to-add-a-locking-mechanism)Modify the database schema to add a locking mechanism Next, modify the schema to add new collections, roles, and user-defined functions (UDFs) to the demo database. The app uses the `Lock` collection to ensure that only one Cloudflare Worker function is processing the Fauna event feed at a time. 1. Create a schema directory: ```bash mkdir schema && cd schema ``` 2. If you haven’t already, log in to Fauna using the CLI: ```bash fauna login ``` 3. Pull the existing demo schema: ```bash fauna schema pull --database us-std/mydb ``` 4. Add the following to the end of `collections.fsl`: ```fsl ... collection Cursor { name: String cursorValue: String? *: Any unique [.name] index byName { terms [.name] } } collection Lock { name: String cursorInfo: Ref? locked: Boolean = false *: Any unique [.name] document_ttls true index byName { terms [.name] } } ``` `Lock` documents have a `locked` field. The field ensures only **one** Cloudflare Worker function is processing the order event feed at a time. `Cursor` collection documents have a `cursorValue` field that stores the cursor for the last event feed page processed. The next Worker function call [gets events after this cursor](../../../learn/cdc/#feed-cursor). The setup is serverless and region-agnostic. 5. Add the following to the end of `functions.fsl`: ```fsl role locksAndCursors { privileges Cursor { create read write } privileges Lock { create read write } } @role(locksAndCursors) function lockAcquire(name, identifier) { let lock = Lock.byName(name + "Lock")!.first() // If locked is true, then we need if we are the ones who own the lock if ((lock != null) && (lock!.locked == true)) { // if the lock document exists and is locked by someone else, return the value of locked // and the identity of who has it locked currently. lock {locked, identity, test: "dfd"} } else if ((lock != null) && (lock!.locked == false)) { // If the lock document exists and is not locked, lock it, set a TTL on the document, and the cursor. lock!.update({locked: true, identity: identifier, ttl: Time.now().add(6000, "seconds"), cursor: Cursor.byName(name + "Cursor")!.first()}) } else if (lock == null) { //if the document doesn't exist, create it, and lock set it to locked by the calling function. Lock.create({ name: name + "Lock", locked: true, identity: identifier, lastProcessedTimestamp: Time.now(), cursor: Cursor.byName(name + "Cursor")!.first(), ttl: Time.now().add(600, "seconds") }) } } @role(locksAndCursors) function lockUpdate(name, cursorValue) { let lock = Lock.byName(name + "Lock")!.first() // If the document is locked, set `locked` to false, update the `lastProcessedTimestamp`, and remove `ttl` field if (lock != null && lock!.locked == true) { // if lock!.update({locked: false, lastProcessedTimestamp: Time.now(), ttl: null, identity: null}) Cursor.byId(lock!.cursor.id)!.update({value: cursorValue}) } else { //if nothing else, abort. abort("Invalid document id or lock not set.") } } ``` The schema defines a role and two [UDFs](../../../learn/schema/user-defined-functions/). 
The `lockAcquire()` UDF acquires a lock and cursor. The `lockUpdate()` UDF releases the lock after processing the event feed. The `locksAndCursors` role grants the minimum privileges required to call the UDFs and perform CRUD operations on the `Lock` and `Cursor` collections. Why use UDFs? The Worker function could handle the functionality of the UDFs. However, there are some advantages to keeping this logic in the database: * The data is kept inside the database. Only required data is returned to the calling client. This leads to lower costs and better performance. * The Worker function can be smaller and less complicated. The core, repeatable logic is run inside the database. * Fauna’s query language, FQL, is strongly consistent. If you instead read data from the database, run logic in the Worker function, and write that data back to the database, the data could have changed. Or you’d require a blocking lock on that data for a relatively extended period. With a UDF, all operations and logic are in the same strongly consistent transaction, performed inside the database. 6. Push the schema to Fauna: ```bash fauna schema push --database us-std/mydb ``` When prompted, accept and stage the schema. 7. Check the status of the staged schema: ```bash fauna schema status --database us-std/mydb ``` 8. When the status is `ready`, commit the staged schema to the database: ```bash fauna schema commit --database us-std/mydb ``` The commit applies the staged schema to the database. ## [](#integrating-fauna-with-cloudflare-workers)Integrating Fauna with Cloudflare Workers You can integrate Fauna using the Cloudflare dashboard or the Wrangler CLI. For this workshop, use the Cloudflare dashboard and the native Fauna integration. 1. Open the Cloudflare dashboard and navigate to the **Workers & Pages** section. 2. Select the **my-fauna-worker** Worker you created earlier. 3. Select the **Integrations** tab. ![Configure Fauna](../../_images/workshops/cf-config.png) 4. Under **Fauna**, select **Add Integration** and authenticate with your existing Fauna account. 5. When prompted, select the Fauna database you created earlier. 6. Select a database security role. For this workshop, you can select the **server** role. For a production deployment, you should create a custom role before this step. ### [](#accessing-data-from-fauna-in-cloudflare-workers)Accessing data from Fauna in Cloudflare Workers You can use a Fauna [client driver](../../drivers/) to access data from Fauna in a Cloudflare Worker. Using a driver is the easiest way to interact with Fauna databases from Cloudflare Workers. Each driver is a lightweight wrapper for the [Fauna Core HTTP API](../../../reference/http/reference/core-api/). For this workshop, use the [JavaScript driver](../../drivers/js-client/#event-feeds): 1. Install the Fauna JavaScript driver in your Cloudflare Worker project. Also install the `uuid` library. ```bash npm install fauna npm install uuid ``` 2. Replace the contents of `src/index.ts` with the following: ```typescript import { Client, fql, FaunaError, FeedClientConfiguration, ServiceError } from 'fauna'; import { v4 as uuidv4 } from 'uuid'; export interface Env { FAUNA_SECRET: string; } export default { async scheduled( request: ScheduledEvent, env: Env, ctx: ExecutionContext ): Promise<Response> { // Instantiate a Fauna client instance. const client = new Client({ secret: env.FAUNA_SECRET }); try { // There are two cursors used in the code.
The first is for where in the Fauna feed to pick up from. // the second is for the document stored in Fauna that is used for locking the feed so it's only processed // one at a time by a single Worker. const myIdentifer = uuidv4().toString(); //generate a unique identifier for this function run. // Call the lockAcquire user-defined function in Fauna to get the cursor information, // lock information, and if you can lock it, append the identity. const lockDataResponse = await client.query( fql`lockAcquire("orderFulfillment", ${myIdentifer})` ); // The response from the UDF is a JSON object with a data field containing the cursor //information and stats. We need the data part only for this example. const lockData = lockDataResponse.data; // If locked is true and the identity field doesn't match, return 409. if ((lockData.locked) && ('identity' in lockData) && !(lockData.identity == myIdentifer)) { return new Response('Another Worker is processing the feed', { status: 409 }); } else if (lockData.locked == true && lockData.identity == myIdentifer) { // Got the lock. Process the Fauna event feed. const cursorValue = await client.query( fql`Cursor.byId(${lockData.cursor.id}) { cursorValue }` ); // Get the value of the cursor. let cursorVal: string | null = cursorValue.data?.cursorValue; const options = cursorVal ? { cursor: cursorVal } : undefined try { // Get an event feed for the `Order` collection. const feed = client.feed(fql`Order.all().eventSource()`, options); for await (const page of feed) { console.log("Page: ", page); // You need to make a decision here if you want to // flatten the events. This example does not. cursorVal = page.cursor; for (const event of page.events) { console.log("Event: ", event); cursorVal = event.cursor; console.log("event cursor: " + cursorVal); switch (event.type) { case "add": // Webhook to add a new order in the fulfillment system //console.log("Add event: ", event); break; case "update": // Webhook to update an order in the fulfillment system console.log("Update event: ", event); break; case "remove": // Webhook to attempt to cancel an order in the fulfillment system console.log("Remove event: ", event); break; } } // Update the cursor in Fauna. const updateCursor = await client.query( fql`Cursor.byId(${lockData.cursor.id})!.update({ cursorValue: ${page.cursor} })` ); } console.log(cursorVal); // Release the lock. await client.query( fql`lockUpdate("orderFulfillment", ${cursorVal})` ); return new Response('I got the lock and then did some stuff!', { status: 200 }); } catch (cursorError) { if (cursorError instanceof FaunaError && cursorError.message.includes("is too far in the past")) { console.warn("Cursor is too old, deleting and retrying..."); // Delete the outdated cursor document. await client.query( fql`Cursor.byId(${lockData.cursor.id})!.update({cursorValue: null})` ); // Unlock the lock document. 
await client.query( fql`lockUpdate("orderFulfillment", ${cursorVal})` ); } else { throw cursorError; } } } else { return new Response('There is nothing to do, something went wrong.', { status: 500 }); } } catch (error) { if (error instanceof FaunaError) { if (error instanceof ServiceError) { console.error(error.queryInfo?.summary); } else { return new Response("Error " + error, { status: 500 }); } } return new Response('An error occurred, ' + error.message, { status: 500 }); } }, }; ``` ### [](#define-data-relationships-with-fsl)Define data relationships with FSL To show off the Cloudflare Cron Trigger and the cursor and lock collections we created above, we’ll use three collections in the Fauna demo database: `Order`, `OrderItem`, and `Product`. Every order has one or more order items, and each order item has one product related to it. When creating an `OrderItem` document, it relates a product to an order. ![Order collections](../../_images/workshops/order-collections.svg) In addition, you see the `items` field in the `Order` collection. This is a [computed field](../../../reference/fsl/computed/). When you read an `Order` document, Fauna will return an array of all products that are part of the order. #### [](#document-relational-model)Document-relational model Fauna supports both document and relational data patterns, making it suitable for a wide range of use cases. [Learn more about Fauna’s document relational model here](https://fauna.com/blog/what-is-a-document-relational-database). In the example above, we demonstrated how to define relationships using [Fauna Schema Language (FSL)](../../../reference/fsl/). You can think of the `Cursor` and `Lock` collections as representing a typical relational model (one-to-one), where cursors are linked to locks. What makes Fauna unique is its capability to perform relational-like joins within a document-based system. Now, let’s look at a many-to-many relationship using Fauna’s document-relational capabilities. ### [](#add-orders-and-items-to-an-order)Add orders and items to an order 1. Create a document in the `Order` collection. Then add an item to the order. ```fql // Create a new order document in the Order collection, but save the order ID. let order = Order.create({ customer: Customer.byId(111), status: "processing", createdAt: Time.now() }) // Create a new order item document in the OrderItem collection and relate it to the order with the ID // from the previous step. OrderItem.create({order: order, product: Product.byName("pizza").first(), quantity: 1}) ``` As you can see, Fauna provides SQL-like relational capabilities while maintaining the flexibility of a document-based database. The `OrderItem` document has two relationships: one with a `Product` and one with the `Order`. Every `Order` document also has a generated `items` field that contains an array of its `OrderItem` documents, forming a many-to-many relationship. If you project that field, you get the items in the order. Learn more about [data relationships in Fauna](../../../learn/data-model/relationships/). ## [](#test-the-application)Test the application ### [](#deploy-the-cloudflare-worker)Deploy the Cloudflare Worker 1. Deploy the Cloudflare Worker: ```bash wrangler deploy ``` 2. Test the Cloudflare Worker by creating a new order. ```fql // Create a new order document in the Order collection, but save the order ID.
let order = Order.create({ customer: Customer.byId(111), status: "processing", createdAt: Time.now() }) // Create a new order item document in the OrderItem collection and relate it to the order with the ID // from the previous step. OrderItem.create({order: order, product: Product.byName("cups").first(), quantity: 1}) ``` You can find the full source code for this workshop in the following [GitHub repository](https://github.com/fauna-labs/cloudflare-tutorial). # Integrations Fauna integrates with several third-party services and platforms. ## [](#authentication)Authentication ![Auth0](../_images/integration/logos/auth0.png) [Auth0](auth0/) Use Auth0 to authenticate your users with Fauna. ![Amazon Cognito](../_images/integration/logos/cognito.png) [Amazon Cognito](cognito/) Use Amazon Cognito to authenticate with Fauna. ![Clerk](../_images/integration/logos/clerk.svg) [Clerk](clerk/) Use Clerk to authenticate your users with Fauna. ![Microsoft Entra](../_images/integration/logos/entra.svg) [Microsoft Entra](entra/) Use Microsoft Entra to authenticate with Fauna. ## [](#gitops)GitOps ![GitHub](../_images/integration/logos/github.svg) [GitHub](../../learn/schema/manage-schema/#github) Manage FSL schema using a CI/CD pipeline in GitHub. ![GitLab](../_images/integration/logos/gitlab.svg) [GitLab](../../learn/schema/manage-schema/#gitlab) Manage FSL schema using a CI/CD pipeline in GitLab. # Amazon Cognito integration This guide covers how to use [Amazon Cognito](https://aws.amazon.com/pm/cognito/) to authenticate with a Fauna database. Once set up, Cognito issues a JWT when end users log into your application. The JWT contains a Fauna [token](../../../learn/security/tokens/)'s authentication secret for the user in a private claim. Your application can use the secret to run queries on the user’s behalf. ## [](#before-you-start)Before you start To complete this guide, you’ll need: * An [Amazon Cognito](https://aws.amazon.com/pm/cognito/) account. * The [AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/install-sam-cli.html). * Familiarity with AWS Lambda and AWS Cognito. ## [](#authentication-flow)Authentication flow You can’t use Cognito as an [access provider](../../../learn/security/access-providers/). Cognito JWTs don’t support `aud` claims, which are required for JWTs used as an authentication secret in Fauna. Instead, the setup uses the following authentication flow: 1. The client application sends a request to Cognito. Cognito invokes an AWS Lambda function in the token generation phase. 2. The Lambda function generates a Fauna [token](../../../learn/security/tokens/) for the end user. The function includes the token’s secret in a private `fauna_token_secret` claim in the payload of the JWT issued by Cognito. 3. Cognito returns the JWT to the client application. The client application uses the token secret to authenticate Fauna queries on the user’s behalf. ![AWS Cognito authentication flow](../../_images/integration/cognito-flow.png) ## [](#configure-fauna)Configure Fauna 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/) and select your database. 2. In the Dashboard Shell, run the following FQL query to create a key with the `server` role: ```fql Key.create({ role: "server" }) ``` ``` { id: "404130885316640841", coll: Key, ts: Time("2099-07-22T17:08:15.803Z"), role: "server", secret: "fn..." } ``` Copy the key’s `secret`. You’ll use it later. 3. 
Create a collection to store [identity documents](../../../learn/security/tokens/#identity-document) for your application’s end users. Edit the collection’s schema to include an `email` field or similar identifier used to uniquely identify users. For example: ```fsl collection Customer { unique [.email] index byEmail { terms [.email] } } ``` 4. Create one or more [user-defined roles](../../../learn/security/roles/#user-defined-role). Edit the role’s schema to include the previous collection in the role’s [`membership` property](../../../learn/security/roles/#role-membership). For example: ```fsl role customer { membership Customer privileges Product { read } ... } ``` ## [](#create-an-aws-lambda-function)Create an AWS Lambda function 1. Initialize a new SAM project. In your terminal, run: ```sh sam init ``` 2. When prompted, select: * `AWS Quick Start Templates` as the template source. * `Hello World Example` as the AWS Quick Start application template. * Enter `N` (no) when asked to use the most popular runtime and package type. * `nodejs20.x` as the runtime. * `Zip` as the package type. * `Hello World Example` as the starter template. Follow the remaining prompts to finish creating the project. 3. Navigate to the project directory and install the Fauna JavaScript driver. ```sh cd your-project-name npm install fauna --save ``` 4. In the project directory, open `app.mjs`. The file contains the JavaScript code for the Lambda function. Replace the file’s contents with the following: ```js import { Client, fql } from 'fauna'; const client = new Client({ secret: process.env.FAUNA_SECRET, }); export const lambdaHandler = async (event, context) => { // Get the email from event. const email = event.request.userAttributes['email']; const fauna_response = await client.query(fql` // If a Customer document with the email exists, get it. // Otherwise, create a Customer document. let user = Customer.byEmail(${email}).first() ?? Customer.create({email: ${email}}) // Create a token for the Customer document. let token = Token.create({ document: user, ttl: Time.now().add(30, 'minutes') }) // Return the Customer document's ID and the // token's secret. let payload = { userId: user!.id, token: token.secret } payload `); event.response = { "claimsOverrideDetails": { "claimsToAddOrOverride": { fauna_token_secret: fauna_response.data.token, userId: fauna_response.data.userId }, "claimsToSuppress": ["email"] } }; context.done(null, event); return event; }; ``` 5. Note that the Lambda function code queries Fauna to get or create an [identity document](../../../learn/security/tokens/#identity-document) for the user and to create a token. The `ttl` property sets the token’s expiration time to 30 minutes. 6. Define the `FAUNA_SECRET` environment variable in the `template.yaml` file. Locate the `Environment` section under your Lambda function’s configuration. If it doesn’t exist, you can add it. 7. Add the `FAUNA_SECRET` environment variable and set its value to the Fauna key secret you created earlier. ```yaml Resources: HelloWorldFunction: Type: AWS::Serverless::Function Properties: CodeUri: hello-world/ Handler: app.lambdaHandler Runtime: nodejs20.x Environment: Variables: FAUNA_SECRET: "fn..." ``` 8. Build and deploy the Lambda function by running the following commands: ```sh sam build sam deploy --guided ``` 9. Note the name of the Lambda function. You’ll use it in the next step to integrate with Cognito.
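Once Cognito invokes this Lambda trigger (configured in the next section), your client application can read the `fauna_token_secret` claim from the issued JWT and use it as the secret for a Fauna client. The following sketch is illustrative, not part of the sample project: `idTokenPayload` stands in for your decoded Cognito ID token payload, and the query assumes a `Product` collection that the user’s role can read, as in the example role above.

```js
import { Client, fql } from 'fauna';

// Placeholder: in a real app, decode the Cognito ID token (JWT) returned at
// login and read its payload.
const idTokenPayload = { fauna_token_secret: 'fn...' };

// The client authenticates as the end user's Customer identity document,
// with the privileges granted by the user-defined role (for example, `customer`).
const client = new Client({ secret: idTokenPayload.fauna_token_secret });

// Runs queries on the user's behalf.
const products = await client.query(fql`Product.all() { name }`);
console.log(products.data);

client.close();
```

Because the Lambda function creates the token with a 30-minute `ttl`, the client should expect to re-authenticate through Cognito once the secret expires.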
## [](#configure-aws-cognito)Configure AWS Cognito 1. Follow the [Amazon Cognito guide](https://docs.aws.amazon.com/cognito/latest/developerguide/tutorial-create-user-pool.html) to create a new user pool. 2. Note your user pool ID and client ID. You’ll use it later to integrate with your application. 3. Navigate to the user pool’s **User-pool-properties** tab and click **Add Lambda trigger**. 4. On the **Add-Lambda-trigger** page, select **Authentication > Pre token generation trigger**. 5. In the **Assign Lambda function** section, enter select the Lambda function you previously created. 6. Click **Add Lambda Trigger**. ## [](#test-user-access)Test user access Verify that the setup works: 1. Create a test user in AWS Cognito. 2. Log in to the AWS Cognito user pool using the test user’s credentials. The following Node.js sample code logs in a user using the AWS Cognito SDK: : ```js // signin.js import { CognitoUserPool, CognitoUser, AuthenticationDetails } from 'amazon-cognito-identity-js'; const config = { UserPoolId: '', ClientId: '' } const poolData = { UserPoolId: config.UserPoolId, ClientId: config.ClientId, }; const userPool = new CognitoUserPool(poolData); const cognitoUser = new CognitoUser({ Username: '', Pool: userPool, }); const authenticationDetails = new AuthenticationDetails({ Username: '', Password: '', }); cognitoUser.authenticateUser(authenticationDetails, { onSuccess: data => { console.log(data); }, onFailure: err => { console.log('Failed', err) }, newPasswordRequired: newPass => { console.log('New Pass Required', newPass) } }) ``` 3. After successful login, you should receive a `CognitoIdToken` JWT with the `fauna_token_secret` field in the payload: ```json ognitoUserSession { idToken: CognitoIdToken { jwtToken: '...', payload: { ... fauna_token_secret: 'fn...', ... } }, refreshToken: CognitoRefreshToken { ... }, accessToken: CognitoAccessToken { ... }, clockDrift: 0 } ``` 4. In your client application, use the `fauna_token_secret` to run Fauna queries on behalf of the user. # Auth0 integration This guide covers how to configure [Auth0](https://www.auth0.com/) as an [access provider](../../../learn/security/access-providers/) for a Fauna database. Once set up, end users can log in to Auth0 to create a JWT for your client application. Your application can use the JWT as an [authentication secret](../../../learn/security/authentication/) to run queries on the user’s behalf. ## [](#before-you-start)Before you start To complete this guide, you’ll need: * An [Auth0](https://www.auth0.com/) account. * A Fauna database with a [user-defined role](../../../learn/security/roles/) to use for JWTs created by Auth0. ## [](#get-the-fauna-audience)Get the Fauna audience 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/) and select your database. 2. Select the **Access Providers** tab and click **Create Access Provider**. 3. Copy the **Audience** URL. Don’t close the Dashboard browser tab. You’ll use it later in the guide. ## [](#configure-auth0)Configure Auth0 1. In a separate browser tab, log in to [Auth0](https://www.auth0.com/). 2. In the left navigation, select **Applications > APIs**. 3. Click **\+ Create API**. 4. In **Name**, enter a name for the API, such as **fauna-my\_app**. 5. In **Identifier**, paste the copied audience URL. 6. Ensure the **RS256 Signing Algorithm** is selected. 7. Click **Create**. 8. Click the **Test** tab to display the API **Test** page. 9. In the **CURL** example, copy the `--url` value. 
Omit the `oauth/token` portion but include the trailing slash (`/`). Don’t close the Auth0 tab. You’ll use it later in the guide. ## [](#configure-fauna)Configure Fauna 1. In the Dashboard browser tab, enter a **Name** for the access provider, such as **Auth0**. 2. In **Issuer**, paste the copied Auth0 API URL. 3. In **JWKS endpoint**, enter the same Auth0 API URL and append `.well-known/jwks.json` to the URL. For example, if the Auth0 API URL is `https://dev—​nozpv3z.us.auth0.com/`, **JWKS endpoint** should contain `https://dev—​nozpv3z.us.auth0.com/.well-known/jwks.json`. 4. Click **Create**. 5. In the Dashboard, update the access provider’s FSL schema to include a user-defined role: ```fsl access provider Auth0 { ... // Adds a user-defined role to JWTs created by the access provider. role } ``` Don’t change the values of the `issuer` or `jwks_uri` fields. 6. Click **Save**. ## [](#test-user-access)Test user access Auth0 is now ready to create Fauna JWTs. Verify that the setup works: 1. In the Auth0 browser tab’s **Test** page, copy the curl example. 2. Run the curl request in your terminal. The output should be similar to: ```json {"access_token":"eyJhbGcIqiJSUzI1N5IsInR5cCi6IkpXVCIsImTpZCI6ilNCZTczWmFyOWpKU3h ueG44QlNTSqJ9eyJpc3MiOiJQdHRwczovL2R6di0tbm96cHYzei51cy5hdXRoMC5jb20vIiwic3ViIjo ibDZ2SlM4UXZIQzJMbWlHUmFPVGlFMTZnaXZ1dWZSMjJAY2xpZW50cyIsImF1ZCI6Imh0dHBzOi8vZGI uZmF1bmEuY29tL2RiL3l4eGY1eDl3MXlieW4iLCJpYXQiOjE2MDU1MDI2NDgsImV4cCI6MTYwNTU4OTA 0OCwiYXpwIjoibDZ2SlM4UXZIQzJMbWlHUmFPVGlFMTZnaXZ1dWZSMjIiLCJndHkiOiJjbGllbnQtY3J lZG.udGlhbHMif6 pdnzxME8gaQkyxsWhurgVzQcakcnMRUJEGcb83f_lgd0tWaE-VcFcfb-SXLCFX3IcJkls9woQVcFM91 UCHRN_qSKjEzB1vOrFqQ73FSq33dLviGM_8E195R_zJVmCsb__ADhQCaWTYM-vO8ZSA7lC2WzVejLAg CJhOXwP7WGeG_FDfqVDM0InaJdVOoUwXF4SzZ00DVjJxSoKnsiRgwpPyaV3rGAQGVlijyYe1mea7D3g jHO2a-yUV-yT75xglTyjwC5WKHySXgu-iXq7x6N5JIRAcBh2-ka6sS5o61JHR35sFfXYpUiSiPj45XL nGhB7wbVwvq4mA3ur1bePg","expires_in":86400,"token_type":"Bearer"} ``` Copy the `access_token` value. The value is the JWT. 3. Use the JWT to run FQL queries as an end user in the Dashboard Shell or using the Fauna CLI. Using the CLI: ```cli fauna shell \ --secret ``` # Clerk integration This guide covers how to configure [Clerk](https://clerk.com/) as an [access provider](../../../learn/security/access-providers/) for a Fauna database. Once set up, end users can log in to Clerk to create a JWT for your client application. Your application can use the JWT as an [authentication secret](../../../learn/security/authentication/) to run queries on the user’s behalf. ## [](#before-you-start)Before you start To complete this guide, you’ll need: * A [Clerk](https://dashboard.clerk.com/sign-up) application. To create a Clerk application, see [Set up your Clerk account](https://clerk.com/docs/quickstarts/setup-clerk) in the Clerk docs. * A Fauna database with a [user-defined role](../../../learn/security/roles/) to use for JWTs created by Clerk. ## [](#get-the-clerk-frontend-api-url-and-jwks)Get the Clerk Frontend API URL and JWKS 1. In the Clerk Dashboard, navigate to the [**API keys**](https://dashboard.clerk.com/last-active?path=api-keys) page. 2. Click **Show API URLs**. 3. Copy and save the **Frontend API URL**. ![Clerk Frontend API URL](../../_images/integration/clerk-frontend-api-url.png) 4. On the **API keys** page, click **Show JWT public key**. Copy and save the **JWKS URL**. ![Clerk JWKS URL](../../_images/integration/clerk-jwks.png) You’ll use the URLs later in the guide. ## [](#configure-fauna)Configure Fauna 1. 
Log in to the [Fauna Dashboard](https://dashboard.fauna.com/) and select your database. 2. Select the **Access Providers** tab and click **Create Access Provider**. 3. Enter a **Name** for the access provider, such as **Clerk**. 4. Copy and save the **Audience** URL. You’ll use the URL later in the guide. 5. In **Issuer**, paste the Clerk Frontend API URL you copied earlier. Do not include a trailing slash (`/`). 6. In **JWKS Endpoint**, paste the Clerk JWKS URL you copied earlier. 7. Click **Create**. 8. Click the access provider you created to access its FSL schema. 9. Update the schema to include a user-defined role: ```fsl access provider Clerk { ... // Adds a user-defined role to JWTs created by the access provider. role } ``` Don’t change the values of the `issuer` or `jwks_uri` fields. 10. Click **Save**. ## [](#create-a-jwt-template-in-clerk)Create a JWT template in Clerk 1. In the Clerk Dashboard, navigate to the [**JWT templates**](https://dashboard.clerk.com/last-active?path=jwt-templates) page. 2. Click **New template** and select **Fauna**. 3. Enter a **Name** for the template, such as `fauna`. 4. In the **Claims** section, set the `aud` claim to the audience URL you copied earlier. 5. Click **Save**. ## [](#authenticate-queries-in-an-application)Authenticate queries in an application You can now use the JWT template to create Fauna JWTs in Clerk. You can use the JWTs to authenticate with Fauna as an end user. For example: ```javascript import React from 'react'; import { useAuth } from '@clerk/nextjs'; import { Client, fql } from "fauna"; const Example = () => { const { getToken } = useAuth(); const [message, setMessage] = React.useState(''); const makeQuery = async () => { let client; try { const secret = await getToken({ template: '' }); client = new Client({ secret: secret }); const response = await client.query(fql`'Hello World!'`); setMessage(response); } catch (error) { console.error(error); setMessage('Error occurred'); } finally { if (client) client.close(); } } return ( <>

Message: {message}

); }; export default Example; ``` # Microsoft Entra integration This guide covers how to use [Microsoft Entra](https://www.microsoft.com/en-ca/security/business/microsoft-entra) to authenticate with a Fauna database. When set up, Entra issues a JWT when end users log into your application. The JWT contains a Fauna [token](../../../learn/security/tokens/)'s authentication secret for the user in a private claim. Your application can use the secret to run queries on the user’s behalf. ## [](#before-you-start)Before you start To complete this guide, you’ll need: * A [Microsoft Azure](https://azure.microsoft.com/) account. * Familiarity with Microsoft Entra and Azure Functions. ## [](#authentication-flow)Authentication flow In this guide, you’ll set up an Azure function that creates a Fauna token when Entra issues a JWT. You’ll configure Entra to use a [custom claim provider](https://learn.microsoft.com/en-ca/entra/identity-platform/custom-claims-provider-reference) to add the token’s secret as a claim to the JWT. The setup uses the following authentication flow: 1. The client sends a request to Microsoft Entra. Entra invokes an Azure Function in the token (JWT) issuance phase. 2. The Azure Function generates a Fauna token and includes the token’s secret in a private `fauna_token_secret` claim in the payload of the JWT issued by Entra. 3. Entra returns the JWT to the client application. The client application uses the token secret to authenticate Fauna queries on the user’s behalf. For more on the general flow and setup, see [Create a REST API for a token issuance start event in Azure Functions](https://learn.microsoft.com/en-ca/entra/identity-platform/custom-extension-tokenissuancestart-setup?tabs=visual-studio%2Cazure-portal&pivots=azure-portal) in the Microsoft Entra docs. ## [](#configure-fauna)Configure Fauna 1. Log in to the [Fauna Dashboard](https://dashboard.fauna.com/) and select your database. 2. In the Dashboard Shell, run the following FQL query to create a key with the `server` role: ```fql Key.create({ role: "server" }) ``` ``` { id: "404130885316640841", coll: Key, ts: Time("2099-07-22T17:08:15.803Z"), role: "server", secret: "fn..." } ``` Copy the key’s `secret`. You’ll use it later. 3. Create a collection to store [identity documents](../../../learn/security/tokens/#identity-document) for your application’s end users. Edit the collection’s schema to include an `email` field or similar identifier used to uniquely identify users. For example: ```fsl collection Customer { unique [.email] index byEmail { terms [.email] } } ``` 4. Create one or more [user-defined roles](../../../learn/security/roles/#user-defined-role). Edit the role’s schema to include the previous collection in the role’s [`membership` property](../../../learn/security/roles/#role-membership). For example: ```fsl role customer { membership Customer privileges Product { read } ... } ``` ## [](#create-a-rest-api-in-azure)Create a REST API in Azure Use an Azure Function to create a REST API. The API creates a Fauna token for an end user. ### [](#create-a-azure-functions-app)Create a Azure functions app 1. Sign in to the [Azure Portal](https://portal.azure.com/#home). 2. From the **Home** page, select **Create a resource**. 3. Search for and select **Function App**. Then select **Create > Function App**. 4. 
Create a function app with the following settings: | Settings | Value | Description | | --- | --- | --- | --- | --- | | Plan | Consumption (Serverless) | Hosting plan that defines how resources are allocated to your function app. | | Subscription | Your subscription | The subscription under which the new function app will be created. | | Function App name | Enter a unique name | A unique name that identifies your new function app. | | Runtime | Node.js | This guide uses Node.js. You can use any runtime with a supported Fauna driver. | | Version | 20 LTS | | | Region | Your preferred region | | | Operating System | Windows | | 5. Select **Review + create** and then select **Create**. It may take a few minutes for Azure to deploy the function app. 6. Once deployed, select **Go to resource**. ### [](#set-up-environment-variables)Set up environment variables 1. In the Azure Portal, navigate to your function app. 2. In the function app’s **Overview** page, click **Settings > Environment variables** in the left navigation. 3. Click **Add**. 4. In the **Add/Edit application setting** pane, enter the following: * **Name**: `FAUNA_SECRET` * **Value**: The Fauna key’s `secret` you copied earlier. 5. Click **Apply**. 6. On the **Environment variables** page, click **Apply**. Then click **Confirm**. ### [](#create-an-http-trigger-function)Create an HTTP trigger function Create an HTTP trigger function in the Azure function app. The HTTP trigger lets you invoke a function with an HTTP request and is referenced by your Microsoft Entra custom authentication extension. 1. Navigate to the function app’s **Overview** page. 2. . In the **Functions** tab, and select **Create function** under **Create in Azure portal**. 3. Select the **HTTP trigger** template and select **Next**. 4. Enter a **Function name** and leave the **Authorization level** as **Function**. Then click **Create**. 5. Add the following code to the function: ```js const { Client, fql, FaunaError } = require("fauna"); module.exports = async function (context, req) { // Read the request body const requestBody = req.body; // Parse the request body const data = requestBody ? JSON.parse(requestBody) : null; const email = data?.data?.authenticationContext?.user.mail; const client = new Client({ secret: process.env.FAUNA_SECRET, }); const fauna_response = await client.query(fql` // If a Customer document with the email exists, get it. // Otherwise, create a Customer document. let user = Customer.byEmail(${email}).first() ?? Customer.create({email: ${email}}) // Create a token for the Customer document. let token = Token.create({ document: user, ttl: Time.now().add(30, 'minutes') }) // Return the Customer document's ID and the // token's secret. let payload = { userId: user!.id, token: token.secret } payload `); // Prepare the response object const response = { data: { '@odata.type': 'microsoft.graph.onTokenIssuanceStartResponseData', actions: [ { '@odata.type': 'microsoft.graph.tokenIssuanceStart.provideClaimsForToken', claims: { FaunaTokenSecret: fauna_response.data.token, } } ] } }; // Return the response context.res = { status: 200, body: response }; }; ``` 6. Save the function. 7. Click **Get function URL**. 8. Copy the **default (Function key)** URL. ### [](#install-the-fauna-driver)Install the Fauna driver 1. Navigate to the function app’s **Overview** page. 2. In the left navigation, select **Development Tools > Console**. 3. Navigate to the http trigger function’s directory. For example: ```bash cd HttpTrigger1 ``` 4. 
In the directory, run the following commands to install the Fauna driver: ```bash npm init -y npm install fauna --save ``` ### [](#register-a-custom-authentication-extension)Register a custom authentication extension 1. In the top navigation, click **Home**. 2. Search for and select **Microsoft Entra ID**. 3. From the Entra **Overview** page, select **Enterprise applications > Custom authentication extensions > Create a custom extension**. 4. In the **Basics** tab, select the **TokenIssuanceStart** event type and click **Next**. 5. In the **Endpoint Configuration** tab, enter the following: * **Name**: Enter a name for the custom extension. * **Target URL**: The Azure function URL you copied earlier. 6. Select **Next**. 7. In **API Authentication** tab, you will be presented with an option to select a app registration type. Select **Create new app registration**. 8. Enter an application **Name** and select **Next**. 9. In the **Claims** tab, enter the following claim: * `FaunaTokenSecret` 10. Select **Next** and then click **Create**. It may take a few minutes for Azure to create the application and custom extension. Once created, you’ll navigate to the custom extension’s **Overview** page. #### [](#grant-admin-consent)Grant admin consent 1. On the custom extension’s **Overview** page, copy the **App ID** under **API Authentication**. You’ll use the ID later in the guide. 2. Under **API Authentication**, click **Grant permission**. 3. If prompted, sign in to Azure and accept the permissions request. ### [](#configure-an-openid-connect-app)Configure an OpenID Connect app To get a token and test the custom authentication extension, you can use the [https://jwt.ms](https://jwt.ms) app from Microsoft. #### [](#register-a-test-web-app)Register a test web app 1. In the top navigation, click **Home**. 2. Search for and select Microsoft Entra ID. 3. In left navigation, select **App registrations > New registration**. 4. Enter a **Name** for the application. For example, `My test application`. 5. Under **Supported account types**, select **Accounts in this organizational directory only**. 6. Under **Redirect URI**, select **Web** and enter [https://jwt.ms](https://jwt.ms) as the URL. 7. Click **Register**. ![Register Test Web Application](../../_images/integration/register-test-web-application.png) 8. In the app registration’s **Overview** page, copy the: * **Application (client) ID** * **Directory (tenant) ID** ![Copy Application ID](../../_images/integration/get-the-test-application-id.png) #### [](#enable-implicit-flow)Enable implicit flow 1. From the app registration’s **Overview** page, navigate to **Manage > Authentication**. 2. Under **Implicit grant and hybrid flows**, check **ID tokens**. 3. Click **Save**. #### [](#enable-claims-mapping-policy)Enable claims mapping policy 1. In the left navigation, navigate to **Manage > Manifest**. 2. In the manifest, locate the `acceptMappedClaims` attribute, and set the value to `true`. 3. Set the `accessTokenAcceptedVersion` to `2`. 4. Select **Save** to save the changes. The following JSON snippet demonstrates how to configure these properties. ```json { "acceptMappedClaims": true, "accessTokenAcceptedVersion": 2 // .... rest of the manifest } ``` ### [](#assign-a-custom-claims-provider-to-your-app)Assign a custom claims provider to your app 1. In the top navigation, click **Home**. 2. Search for and select Microsoft Entra ID. 3. In the left navigation, select **Manage > Enterprise applications**. 4. 
Find and select the application you created from the list. 5. From the **Overview** page, select **Manage >Single sign-on** in the left navigation. 6. Next to **Attributes & Claims**, select **Edit**. ![Assign attributes](../../_images/integration/open-id-connect-based-sign-on.png) 7. Expand **Advanced settings**. 8. Next to **Custom claims provider**, select **Configure** 9. In the **Customer claims provider** pane, select the custom claims provider you created earlier. 10. Click **Save**. Next, assign the attributes from the custom claims provider, which should be issued into the token as claims: 1. On the **Attributes & Claims** page, select **Add new claim**. 2. Enter a **Name** of `FaunaTokenSecret`. 3. Select a **Source** of **Attribute**. 4. Select a **Source attribute** of `"customClaimsProvider.FaunaTokenSecret"`. 5. Select **Save**. ## [](#protect-your-azure-function)Protect your Azure Function The custom authentication extension uses a server-to-server flow to obtain an access token. The token is sent in the HTTP `Authorization` header to your Azure function. Use the following steps to add Microsoft Entra as an identity provider to your Azure Function app: 1. In the top navigation, click **Home**. app you previously published. 2. From the **Home** page, find and select the function app you created earlier. 3. In the left navigation, select **Settings > Authentication**. 4. Select **Add identity provider**. 5. Select an **Identity provider** of **Microsoft**. 6. Select a tenant type of **Workforce configuration**. 7. Under **App registration**, select an **App registration type** of **Pick an existing app registration**. Then select the custom authentication extension app you created earlier. 8. Enter the following issuer URL, [https://login.microsoftonline.com//v2.0](https://login.microsoftonline.com//v2.0), where `` is the Entra application’s tenant ID you copied earlier. 9. Select a **Unauthenticated requests** of **HTTP 401 Unauthorized** . 10. Uncheck **Token store**. 11. Select **Add**. ![Add Identity Provider](../../_images/integration/add-identity-provider.png) ### [](#test-the-application)Test the application 1. Open a new private browser and navigate and sign-in through the following URL. ```bash https://login.microsoftonline.com//oauth2/v2.0/authorize?client_id=&response_type=id_token&redirect_uri=https://jwt.ms&response_mode=form_post&scope=openid&state=12345&nonce=678910 ``` 2. Replace: * `` with the Entra application tenant ID you copied earlier. * `` with the Entra application’s client ID you copied earlier. 3. After logging in, you’ll be presented with your decoded token at [https://jwt.ms](https://jwt.ms). The decoded token will contain the `FaunaTokenSecret` claim field. # General requirements and limits ## [](#browser-rqmts)Browser requirements | Supported browser | Minimum version | Browser update link | | --- | --- | --- | --- | --- | | Chrome | 69 | Update Google Chrome | | Edge | 79 | Update to the new Microsoft Edge | | Firefox | 62 | Update Firefox to the latest release | | Safari | 12.1 | Update to the latest version of Safari | ## [](#glimits)Global limits | Constraint | Limit | | --- | --- | --- | --- | | Document size | 8 MB per document.Document size is the size of document once encoded. Fauna encodes documents in a binary format that’s smaller than raw JSON.FSL schema are stored as documents and are subject to this limit. 
| | HTTP request payload size | 16 MB per request | | Transaction size | 16 MB per transactionTransaction size is the total size of the transaction’s write. Read-only queries are not subject to this limit. Uses the size of binary-encoded document. | | Compute operations | 12,000 per transaction | | Default query execution time even without a timeout | 2 minutes | | Maximum query execution time even with a timeout | 10 minutes | | Maximum number of collections per database | 1,024Index builds for collections with more than 128 documents are handled by a background task. The limit prevents an excessive number of indexes that must be built simultaneously.If a transaction exceeds this limit, Fauna returns a limit_exceeded error code and a 429 HTTP status code. | | Index entries | 64 KBAn index entry includes terms and values fields and must not exceed 64 KB. | | Pagination size range | 1 to 16000 (inclusive)Minimum and maximum number of paginated values returned by the set.pageSize() or set.paginate() method. | | UDF recursion limit | User-defined function (UDF) recursion is limited to a depth of 2048 calls. | | Array size | 16,000 elements per Array. | | String size | 16,777,216 characters | | Array.sequence() range limit | When calling Array.sequence(), the difference between the start and end values can’t exceed 16,000. | | .fsl schema file limits | A database can have up to 1,024 .fsl files, including main.fsl. This limit does not include .fsl files for child databases. | # FQL language reference The FQL language reference document describes the FQL syntax and the fundamentals of using the language to query Fauna. FQL features a TypeScript-inspired syntax, which is extended to support concise relational queries and is, optionally, statically typed. This gives you a developer language that empowers you to write clear, concise queries and transactional code. A TypeScript-inspired database language is purpose-fit for Fauna. The resulting FQL syntax is familiar and easy to learn and scales to handle complex business logic, making it easy to realize the power of the Fauna operational database architecture. Unlike well-known relational query languages, which are keyword-heavy and require learning a new syntax for every feature, FQL has a small core syntax with dedicated shorthand for common query concepts, such as record predicates and projection. Occasionally, you might want to port a query from one context to another, such as interactively developing the query in the dashboard shell and copying it to your application code or applying your FQL knowledge in a different host programming environment. FQL has a dedicated syntax that applies to all contexts in which it exists, with drivers that implement a secure, string-based template system for embedded queries. This means that you can apply the same query syntax everywhere. # FQL syntax quick look This gives you a quick look at the Fauna Query Language (FQL) syntax. FQL is similar to JavaScript with influences from TypeScript and GraphQL, differing from those languages in that it is optimized for database operations. ## [](#basic-syntax)Basic syntax ```fql // Single-line comments start with double-slash. /* Block comments start with slash-asterisk and end with asterisk-slash. */ // Statements don't have to be terminated by ; ... (1 + 3) * 2 // ... but can be. 
(1 + 3) * 2; ///////////////////////////////////////////// // Numbers, Strings, and Operators // FQL has integer, decimal, and exponential values stored as // Int, Long, or Double types, which are all Number types. // An Int is a signed 32-bit integer type. // A Long is a signed 64-bit integer type. // Doubles are double-precision, 64-bit binary type, IEEE 754-2019. 3 // 3 1.5 // 1.5 3.14e5 // 314000.0 // Some basic arithmetic works as you'd expect. 1 + 1 // 2 0.1 + 0.2 // 0.30000000000000004 8 - 1 // 7 10 * 2 // 20 35 / 5 // 7 // Including uneven division. 5 / 2 // 2 // Precedence is enforced with parentheses. (1 + 3) * 2 // 8 // There's also a boolean type. true false // Strings are created with ' or ". 'abc' // "abc" "Hello, world" // "Hello, world" // Negation uses the ! symbol !true // false !false // true // Equality is == 1 == 1 // true 2 == 1 // false // Inequality is != 1 != 1 // false 2 != 1 // true // More comparisons 1 < 10 // true 1 > 10 // false 2 <= 2 // true 2 >= 2 // true // Strings are concatenated with + "Hello " + "world!" // "Hello world!" // and are compared with < and > "a" < "b" // true // Type coercion isn't performed for comparisons with double equals... "5" == 5 // false // You can access characters in a string with at() built-in string method. "This is a string".at(0) // "T" // ...or use an index. "Hello world"[0] // "H" // "length" is a property so don't use (). "Hello".length // 5 // There's also "null". null; // used to indicate a deliberate non-value // false, null, 0, and "" are falsy; everything else is truthy. ///////////////////////////////////////////// // Arrays, and Objects // Arrays are ordered lists containing values, of any type. let myArray = ["Hello", 45, true] myArray // ["Hello", 45, true] // Their members can be accessed using the square-brackets subscript syntax. // Array indices start at zero. You can also use the `at()` built-in method. myArray[1] // 45 myArray.at(1) // 45 // Arrays are of variable length. myArray.length // 3 // Accessing an Array index greater than or equal to the Array length: myArray[3] // results in an error. // Create an Array from elements in the range index 1 (include) to // index 4 (exclude). myArray.slice(1, 4) // [45, true] // FQL objects are equivalent to "dictionaries" or "maps" in other // languages: an unordered collection of key:value pairs. You can use // the dot syntax provided the key is a valid identifier. let myObj = {key1: "Hello", key2: "World"} myObj.key1 // "Hello" // Keys are strings but quotes aren't required if they're a valid // JavaScript identifier. Values can be any type. let myObj = {myKey: "myValue", "myOtherKey": 4} // Object attributes can also be accessed using the subscript syntax. myObj.myKey // "myValue" myObj.myOtherKey // 4 // If you try to access a value not yet set, you'll get an error. myObj.myThirdKey // results in an error. ///////////////////////////////////////////// // Variables // Use the "let" keyword to declare variables in a lexical scope and // assign a value to the variable. FQL is dynamically typed, so you don't // need to specify type. Assignment uses a single = character. Also, "let" // can't be the last statement, because it isn't an expression. let someVar = 5 someVar // 5 // A variable in the same scope can be assigned a new value: let someVar = 5 let someVar = 6 someVar // 6 // You can declare only one variable in the same `let` statement. // You can use "let" with an "if ... else" statement to // conditionally assign a variable value. 
let x = "cat" let y = if (x == "cat") { "Meow" } else if (x == "dog") { "Woof" } else { "Call the exterminator" } y // "Meow" ///////////////////////////////////////////// // Control structures, logic // The `if ... else` structure works as you'd expect. let count = 1 if (count == 3) { // evaluated if count is 3 } else if (count == 4) { // evaluated if count is 4 } else { // evaluated if not 3 or 4 } // `&&` is logical AND, `||` is logical OR let house = {size: "big", color: "blue"} if (house.size == "big" && house.color == "blue") { "big blue house" } if (house.color == "red" || house.color == "blue") { "red or blue" } ///////////////////////////////////////////// // Block scope // A block is defined with `{ }` and variables are scoped to the block. // Variables outside of the block are global. // The last statement in a block must be an expression. let greeting = "hi" { let greeting = "hello" greeting } // "hello" let greeting = "hi" { let greeting = "hello" greeting } greeting // "hi" ///////////////////////////////////////////// // Anonymous functions // FQL anonymous functions are declared using the short-form // arrow syntax. An anonymous function can't be called before the definition. (x) => { let greeting = "hello" greeting } // Objects can contain functions. let myFunc = { (x) => { let double = x + x double } } myFunc(3) // 6 // Some built-in methods accept single-argument functions. Customer.all().where(c => c.address.zipCode == "20002") // Which is equivalent to: let myFunc = { (c) => { c.address.zipCode == "20002" } } Customer.all().where(myFunc) ``` # Lexical elements Fauna Query Language (FQL) supports UTF-8 encoded [Unicode](https://home.unicode.org/) text. This section covers the basic character set, whitespace, line termination, and comments syntax elements. ## [](#char-set)Character set FQL supports the following character set: a b c d e f g h i j k l m n o p q r s t u v w x y z A B C D E F G H I J K L M N O P Q R S T U V W X Y Z 0 1 2 3 4 5 6 7 8 9 $ ! % & ( ) \* + , - . / : ; < = > ? @ \[ \] ^ \_ { | } ~ \` Schema entity naming might impose added restrictions on the allowed character set. ## [](#whitespace)Whitespace Whitespace may be used between FQL tokens and are ignored by the parse. Whitespace is a nonempty sequence of any of the following characters, as listed in the [ECMAScript Language Specification](https://tc39.es/ecma262/#sec-white-space): | Unicode code point | Description | Escape sequence | | --- | --- | --- | --- | --- | | U+0009 | Character tab | \t | | U+000B | Line tab | \v | | U+000C | Form feed | \f | | U+0020 | Space | | | U+00A0 | No-break space | | | U+FEFF | Zero-width no-break space | | Line termination and block-style comments are also treated as whitespace and can be used anywhere whitespace can be used. ## [](#line-terminators)Line terminators Line terminators can be used where permitted by the syntactic grammar, which is wherever whitespace is allowed. 
The following table lists the recognized line termination characters: | Unicode code point | Description | Escape sequence | | --- | --- | --- | | U+000A | Line feed | \n | | U+000D | Carriage return | \r | | U+2028 | Line separator | | | U+2029 | Paragraph separator | | ## [](#escape-char)Escaped characters Use a backslash (`\`) to escape a character in a [String](../types/#string): | Escape sequence | Escaped character | | --- | --- | | \0 | null | | \' | single quote | | \" | double quote | | \` | backtick | | \# | number symbol | | \b | backspace | | \t | horizontal tab | | \n | line feed, new line | | \v | vertical tab | | \f | form feed | | \r | carriage return | | \\ | backslash | | \xhh | character represented by the hexadecimal value hh | | \uhhhh | character represented by the Unicode code point hhhh | | \u{hh…} | character represented by the arbitrary Unicode code point hh… | ## [](#comments)Comments FQL supports single-line and block comments. * Single-line Comments The `//` character sequence starts a single-line comment. A line feed terminates the comment. ```fql // This is a single-line comment "String" // This comment annotates a line ``` * Block Comments A block comment can be used where whitespace is permitted. The `/*` character sequence starts a block comment. The comment is terminated by the matching `*/` character sequence. A block comment can span multiple lines and block comments can be nested. ```fql /* This block comment is on one line */ "String" /* This comment annotates a line */ /* This comment spans multiple lines */ /* This comment spans multiple lines. /* This block comment is nested within a block comment. */ */ ``` # Literals Fauna Query Language (FQL) supports literal syntax for [Number](../types/#number)s, [Boolean](../types/#boolean) values, [Null](../types/#null), [String](../types/#string)s, [Array](../types/#array)s, [Object](../types/#object)s, and [Function](../types/#function)s. ## [](#boolean)Boolean A [Boolean](../types/#boolean) is a logical expression that evaluates to true or false. The [Boolean](../types/#boolean) literals are `true` and `false`. ## [](#null)Null The [Null](../types/#null) literal represents the absence of a value. In [Boolean](../types/#boolean) expressions, `null` evaluates to `false`. The [Null](../types/#null) literal is `null`. ## [](#literal-num)Number A [Number](../types/#number) is a base 10 [Int](../types/#int), decimal, or exponential. Underscores can be used as separators instead of commas to improve readability but aren’t significant otherwise. ### [](#integer)Integer An [Int](../types/#integer) literal starts with a nonzero digit and doesn’t include a fractional or exponential part. A negative integer literal is preceded by the `-` character. 10 -250 1_000_000 Integer literals can have the following base representations: | Base | Example | | --- | --- | | 2 | 0b1001 | | 8 | 0o123 | | 10 | 12345 | | 16 | 0x12a3 | ### [](#decimal)Decimal A decimal literal includes a fractional part. Internally, these are handled as [Double](../types/#double)s. 1.0 0.1 ### [](#exponential)Exponential E-notation is used for a decimal exponential literal: `<base>` `e`|`E` `<exponent>`: 1.2e23 3.4e-15 where: * `<base>` is the base integer. * `e` or `E` separator indicates an exponential literal. * `<exponent>` is the signed integer exponent.
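To tie the integer, decimal, and exponential forms together, here’s a small illustrative block of number literals. The annotated values are approximations of what each literal evaluates to, not guaranteed shell output:

```fql
// Integer literals, optionally using underscores as separators
// and base prefixes.
1_000_000   // 1000000
0b1001      // 9 (binary)
0o123       // 83 (octal)
0x12a3      // 4771 (hexadecimal)

// Negative integer literal.
-250        // -250

// Decimal and exponential literals are handled as Doubles.
1.5         // 1.5
3.4e-2      // 0.034
1.2e3       // 1200.0
```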
## [](#string)String A [String](../types/#string) literal is a sequence of single-quoted string values or double-quoted interpolated string values from the FQL-supported [character set](../lexical/#char-set). Single-quoted [String](../types/#string)s can include the `"` and `#` characters. Double-quoted [String](../types/#string)s can include the `'` character. ### [](#interpolated-strings)Interpolated string Double-quoted [String](../types/#string)s support expression interpolation using the `#{}` character sequence, where the content inside `#{}` is an FQL expression: ```fql "Alice is #{3 + 2} years old." ``` ``` "Alice is 5 years old." ``` See the [escaped characters](../lexical/#escape-char) section for a list of escaped characters. ### [](#heredoc)Heredoc string A heredoc [String](../types/#string) is a way to write a string that spans multiple lines, enclosing the string with a beginning and ending user-defined token: ```fql <<+TOKEN A multiline string with leading whitespaces TOKEN ``` ``` <<-END A multiline string with leading whitespaces END ``` The token, on a line by itself, terminates the [String](../types/#string). On rendering, leading whitespace is removed from each line, maintaining the same relative indentation for each line. Heredoc strings support interpolation: ```fql let weather = "overcast with occasional showers" <<+EOS Hello. Today is #{weather} EOS ``` ``` <<-END Hello. Today is overcast with occasional showers END ``` To declare a [String](../types/#string) with variables but without doing interpolation, precede the [String](../types/#string) with a token in the format `<<-TOKEN`: ```fql <<-ASIS str => "Hello #{str}" ASIS ``` ``` <<-END str => "Hello #{str}" END ``` You might want to avoid interpolation when defining a Function body, for example. The `#{str}` variable executes in the context of the function instead of when the string is rendered. ## [](#array)Array [Array](../types/#array)s group items and are represented as a comma-separated list of literals or expressions enclosed by `[ ]`: ```fql ["a", 1 + 1, if (true) "true" else "false", null] ``` ``` [ "a", 2, "true", null ] ``` [Array](../types/#array) members can be accessed individually using bracket notation. Members are indexed left to right, starting with 0. In the example, the `"a"` member is Array element 0 and the `null` member is Array element 3: ```fql let array = ["a", 1 + 1, if (true) "true" else "false", null] [ array[0], array[3]] ``` ``` [ "a", null ] ``` ## [](#object)Object An [Object](../types/#object) is represented as a key:value pair enclosed by `{ }`, or field:value pair when referring to schema entities. ```fql { "a": 1 } ``` An object can hold multiple, comma-separated key:value pairs: ```fql { "a": 1, b: 2 } ``` Key or field names can be identifiers or [String](../types/#string) literals. For example, `{ "a": 1, b: 2 }` uses a String literal key and an identifier key. ## [](#anonymous-function)Anonymous function An anonymous function is represented using the short-form, JavaScript-like arrow function syntax: ([parameter[, parameter]]) => {[statement [statement …​ ]] expression [expression …​ ]} Or, simplified: (parameter) => { expression } If the function has only one parameter, `()` can be omitted around the parameter. If the function body has only a single expression, the `{}` block delimiter can be omitted. # Reserved words ## [](#reserved-fql-keywords-and-type-names)Reserved FQL keywords and type names The following FQL keywords and type names are reserved and may not be used as a variable, parameter, or top-level entity name.
Keywords: | at | false | if | let | null | true | | --- | --- | --- | --- | --- | --- | Types: | Array | Boolean | Bytes | Date | Double | | --- | --- | --- | --- | --- | | Function | Long | Null | Number | Object | | String | Timestamp | Tuple | Union | Uuid | ## [](#reserved-schema)Reserved schema names The following schema entity, method, and metadata names may not be used as top-level, user-defined entity names. The acronym `FQL` itself is also reserved. This means `FQL.Collection` can be called to get the Collection module if needed. Schema entities: | AccessProvider | Collection | Credential | Database | Doc | | --- | --- | --- | --- | --- | | Function | Index | Key | Query | Role | | Token | View | | | | Schema methods: | delete | replace | update | updateData | replaceData | | --- | --- | --- | --- | --- | Schema metadata fields: | coll | id | ts | ttl | | --- | --- | --- | --- | Schema data field: | data | | --- | ## [](#reserved-index-names)Reserved index names The following names can’t be used as index names in a [Collection](../../fql-api/collection/): | documents | events | self | sets | | --- | --- | --- | --- | ## [](#see-also)See also [Aliasing](../naming/#aliasing) [Document field name collision](../naming/#collision) # Types This page provides information about FQL data types, including a list of available types. FQL supports [static typing](../static-typing/) with optional typechecking. ## [](#encode-fql-types-as-json)Encode FQL types as JSON When transmitting FQL data, the [Fauna Core HTTP API](../../http/reference/core-api/) encodes FQL data types as JSON using one of two data formats: * [Tagged format](../../http/reference/wire-protocol/#tagged): Tags JSON values with FQL type annotations, ensuring lossless typing. * [Simple format](../../http/reference/wire-protocol/#simple): Lossy format that converts FQL values to their closest JSON type, without annotations or transformations. The Core API’s [Query endpoint](../../http/reference/core-api/) uses the simple format by default. For more information about data formats, see [Wire protocol: Encode FQL as JSON](../../http/reference/wire-protocol/). ## [](#check-a-values-type)Check a value’s type Use the [`isa`](../operators/#isa) operator to check if a value is of a specific FQL type. For example: ```fql "foo" isa String // true 123 isa String // false 123 isa Int // true 0.123 isa Double // true 123 isa Number // true 0.123 isa Number // true { a: "foo", b: "bar" } isa Object // true [ 1, 2, 3 ] isa Array // true Product.all() isa Set // true // For documents, the type is the name // of the collection. For example, `Product` // collection documents have a type of `Product`. Product.byName('limes').first() isa Product // true // Document references also resolve to the // document type. Product.byId('111') isa Product // true // Dangling references, which point to documents // that don't exist, still resolve to the // document type. For example, the following // document doesn't exist. Product.byId('999') isa Product // true ``` `isa` doesn’t support [parameterized generic types](../static-typing/#generic), such as `Ref` or `Array`.
Queries that attempt to use `isa` with such types return an `invalid_query` error: ```fql // NOT SUPPORTED: [ 1, 2, 3 ] isa Array // NOT SUPPORTED: Product.all() isa Set // NOT SUPPORTED: Product.byId('111') isa Ref ``` ## [](#persistable)Persistable types Persistable types have values that can be reliably stored, retrieved, and updated in a Fauna database. Documents can only store persistable values. [FSL collection schema: Field definitions](../../fsl/field-definitions/) only allow persistable types. Persistable types are: * [Boolean](#boolean) * [Date](#date) * [Ref](#ref) (document references), excluding references to [named system collection](../../../learn/data-model/collections/#named-coll) documents * [Null](#null) * [Number](#number), including [Double](#double), [Int](#int), and [Long](#long) * [String](#string) * [Time](#time) * [Array](#array) of other persistable types * [Object](#object) of other persistable types * [Tuple](#tuple) of other persistable types * [Union](#union) of other persistable types ## [](#scalar)Scalar types The scalar types are JSON serializable types that hold a single value. Types, including [String](#string), [Date](#date), and [Time](#time), are objects that have built-in methods and properties. ### [](#boolean)Boolean A boolean data type has a boolean literal value of `true` or `false`. Boolean variables and expressions can be compared explicitly or implicitly against a boolean literal. In the following example, the `if` statements are equivalent: ```fql let a = true if (a == true) { // expression } if (a) { // expression } ``` Comparison operators return a boolean value: ```fql let a = 10 let b = 20 a > b ``` ``` false ``` ### [](#bytes)Bytes | Reference: Bytes | | --- | --- | --- | A [Bytes](#bytes) value stores a byte array, represented as a Base64-encoded string. You can use [Bytes](#bytes) to store binary data in a Fauna database. ### [](#date)Date | Reference: Date | | --- | --- | --- | A date in the `YYYY-MM-DD` format, such as `2099-11-03`. A [Date](#date) value is encoded as a string in responses. ### [](#double)Double See [Number](#number). ### [](#id)ID A 64-bit integer represented as a decimal number that uniquely identifies a resource. ### [](#int)Int See [Number](#number). ### [](#long)Long See [Number](#number). ### [](#null)Null The Null type has the single value `null`. Null is a marker used to indicate that a data value doesn’t exist. It is a representation of missing information, indicating a lack of a value, which differs from a value of zero. ### [](#nulldoc)NullDoc A marker used to indicate that a document doesn’t exist or is inaccessible. The data type is taken from the collection’s name with the `Null` prefix. For example, a `NullDoc` for the `Product` collection has the `NullProduct` type. NullDocs coalesce as a `null` value. Testing a `NullDoc` against a value of `null` returns `true`. `NullDoc` is returned by the `byId()` or `byName()` queries and by links from another doc (`id` or `coll`) when the linked document doesn’t exist or is inaccessible. The following lists `NullDoc` types for system collections: | NullDoc type | Description | Value | | --- | --- | --- | --- | --- | | NullAccessProvider | Returned type when an AccessProvider document doesn’t exist. | null | | NullCollectionDef | Returned type when a Collection document, which has the CollectionDef type, doesn’t exist. | null | | NullCredential | Returned type when a Credential document doesn’t exist. 
| null | | NullDatabaseDef | Returned type when a Database document, which has the DatabaseDef type, doesn’t exist. | null | | NullFunctionDef | Returned type when a Function document, which has the FunctionDef type, doesn’t exist. | null | | NullKey | Returned type when a Key document doesn’t exist. | null | | NullRole | Returned type when a Role document doesn’t exist. | null | | NullToken | Returned type when a Token document doesn’t exist. | null | ### [](#number)Number FQL has built-in types that represent numbers, which are summarized in the following table: | Type | Size, bits | Description | Range | Examples | | --- | --- | --- | --- | --- | | Double | 64 | double-precision IEEE-754 floating point | | 1.1234, 1.2e23 | | Int | 32 | Signed two’s complement integer | -2³¹ to 2³¹-1 | 10, -250, 1_000_000 | | Long | 64 | Signed two’s complement integer | -2⁶³ to 2⁶³-1 | 9223372036854775807, -9223372036854775808 | Underscores are permitted in numeric [literals](../literals/#literal-num) as separators to aid readability but have no other significance. ### [](#string)String | Reference: String | | --- | --- | --- | String literals are single-quoted string values or double-quoted interpolated string values from the FQL-supported [character set](../lexical/#char-set). Single-quoted strings can also include the `"` and `#` characters. Double-quoted strings can also include the `'` character. Use `\` to escape a character in a string. #### [](#interpolated-strings)Interpolated strings Double-quoted strings support the interpolation of expressions in a string. Encode the interpolation using the `#{}` character sequence, where the content inside `#{}` is an FQL expression: ```fql "Alice is #{3 + 2} years old." ``` evaluates to: ``` "Alice is 5 years old." ``` #### [](#heredoc-strings)Heredoc strings You can use heredoc strings for multiple lines of text that behave similarly to double-quoted strings. A heredoc string is delimited by tokens that start with the `<<` character sequence followed by a user-defined character and a line break. Conventionally, the token uses uppercase characters. The same token terminates a heredoc string and is on a line by itself: ```fql <<+STR A multiline string STR ``` A heredoc string removes leading whitespace at the beginning of each line, removing the same number of characters in each line: ```fql <<+STR A multiline string with leading whitespaces STR ``` evaluates to: ``` <<-END A multiline string with leading whitespaces END ``` ### [](#time)Time | Reference: Time | | --- | --- | --- | The Time type is an instant in time expressed as a calendar date and time of day in UTC, in the range `-999999999-01-01T00:00:00Z` to `9999-12-31T23:59:59.999999999Z`. A [Time](#time) value can store nanosecond precision. Times can be inserted with offsets but are converted to UTC and the offset component is lost. A [Time](#time) value is encoded as a string in responses. ### [](#transactiontime)TransactionTime | Reference: TransactionTime | | --- | --- | --- | The TransactionTime type is the query transaction time expressed as a calendar date and time of day in UTC, in the range `-999999999-01-01T00:00:00Z` to `9999-12-31T23:59:59.999999999Z`. A [TransactionTime](#transactiontime) value can store nanosecond precision. A [TransactionTime](#transactiontime) value is encoded as a string in responses. ### [](#uuid)Uuid Universally Unique IDentifier (UUID) uniquely identifies a resource. ## [](#special)Data reference types The data reference types represent Fauna resources and have built-in methods and properties.
### [](#collection)Collection | Reference: Collection | | --- | --- | --- | A Collection groups documents in a database. ### [](#collectiondef)CollectionDef | Reference: Collection | | --- | --- | --- | A Collection definition, represented as a [`Collection` document](../../fql-api/collection/#collection). A collection definition is the FQL equivalent of an FSL [collection schema](../../../learn/schema/#collection-schema). ### [](#database)Database A database can have Collections, Documents, User-defined functions, security elements such as Keys, Tokens, Credentials, Access Providers, and child databases. ### [](#databasedef)DatabaseDef | Reference: Database | | --- | --- | --- | A database definition, represented as a [`Database` document](../../fql-api/database/#collection). ### [](#document)Document | Reference: Document | | --- | --- | --- | A record in a [Collection](#collection). You add documents to a collection as JSON-like objects. The document type is taken from the name of the collection of which the document is a member. For example, a document in the `Product` collection is of type `Product`, and if the requested document isn’t a member of the collection, the response type is `NullProduct`. All documents have [metadata fields](../../../learn/data-model/documents/#meta). ### [](#ref)Ref (Document reference) | Learn: Model relationships using document references | | --- | --- | --- | A reference to a [Document](#document). Can resolve to an existing document or a [NullDoc](#nulldoc). Document references contain the document’s [collection](../../../learn/data-model/collections/) and [document ID](../../../learn/data-model/documents/). You can use document references to [model relational data](../../../learn/data-model/relationships/) and create [relationships between documents](../../../learn/data-model/relationships/). ### [](#namedref)NamedRef A [reference](../../../learn/data-model/relationships/) to a [Document](#document) in a [named system collection](../../../learn/data-model/collections/). Can resolve to an existing document or a [NullDoc](#nulldoc). In named collections, each document is uniquely identified by its name instead of a [document ID](../../../learn/data-model/documents/). Named references contain the document’s [collection](../../../learn/data-model/collections/) and `name`. ### [](#function)Function | Reference: Function | | --- | --- | --- | A function is an FQL expression stored in a database. ### [](#functiondef)FunctionDef | Reference: Function | | --- | --- | --- | A [user-defined function (UDF)](../../../learn/schema/user-defined-functions/) definition, represented as a [`Function` document](../../fql-api/function/#collection). A function definition is the FQL equivalent of an FSL [function schema](../../fsl/function/). ## [](#advanced)Advanced types The advanced types represent complex objects or types that can hold multiple values. The iterable [Array](#array) and [Set](#set) types have built-in properties and methods. ### [](#any)Any The Any type denotes a field that can be of any other type or might be not present. ### [](#array)Array | Reference: Array | | --- | --- | --- | The [Array](#array) type is a comma-separated list of expressions enclosed by `[]`. 
An Array can hold mixed types, including functions: ```fql ["a", 1 + 1, if (true) "true" else "false", null] ``` evaluates to: ``` [ "a", 2, "true", null ] ``` For [`array.map()`](../../fql-api/array/map/), [`array.forEach()`](../../fql-api/array/foreach/), and similar methods, Array literals are evaluated left to right. The Array type is an iterable data structure that has an ordered collection of typed values. An Array: * can hold values of different types. * can be accessed only using non-negative integer indexes. * is zero-indexed. * can’t contain more than 16,000 elements. You can use a [check constraint](../../fsl/check/) with [`array.distinct()`](../../fql-api/array/distinct/) to enforce distinct values within a document’s [Array](#array) field. See [Enforce unique values in an Array field](../../fsl/check/#unique-array-val). ### [](#event-source)Event Source | Reference: Event feeds and event streams | | --- | --- | --- | A string-encoded token representing an [event source](../../../learn/cdc/): ``` "g9WD1YPG..." ``` When tracked changes occur in a database, the event source emits a related event. To create an event source, use an FQL query that calls [`set.eventSource()`](../../fql-api/set/eventsource/) or [`set.eventsOn()`](../../fql-api/set/eventson/) on a [supported Set](../../../learn/cdc/#sets). You can use the token to consume the source’s events as an [event feed or event stream](../../../learn/cdc/). For supported event source methods, see [EventSource instance methods](../../fql-api/event-source/#instance-methods). ### [](#iterable)Iterable The type of all iterable types: * Array type * Set type ### [](#never)Never FQL method signatures use the Never type to represent a value that never occurs or the return value of a function that never returns. See [`abort()`](../../fql-api/globals/abort/) for an example. ### [](#module)Module The Module type is a singleton object that represents a grouping of functionality. Examples include [Math](../../fql-api/math/), [Query](../../fql-api/query/), [Function](../../fql-api/function/), and [Collection](../../fql-api/collection/)s. A module gets serialized as an `@mod` value in the tagged format. Use the [isa](../operators/#comparison) operator for testing a module type. ### [](#object)Object The Object type is the type of all objects and represents a JSON-like object whose contents are a collection of key:value pairs. The keys must be strings and the values must be valid FQL data types. The value expressions are evaluated sequentially in the order defined, left to right. Objects evaluate to their contents. Objects can be combined to emulate abstract data types found in other functional languages. Object subtypes: * [Struct](#struct) * [Document](#document) Object types may have zero or more field types and an optional wildcard field type. Examples: ```fql { x: Number, y: Number } // object with two fields { kind: "dog" | "cat" | "bird", age: long } ``` The wildcard field type opens an object to arbitrary fields that correspond to the wildcard type: ```fql // An object with one `long` field and any number of `string` fields { count: Long, *: String } // An object with any number of arbitrary fields { *: Any } ``` You can use [`Object` static methods](../../fql-api/object/) to further transform objects. ### [](#set)Set | Reference: Set | | --- | --- | --- | A Set is an iterable group of values, typically representing documents in a collection. ### [](#singleton)Singleton Every primitive value is also a Singleton type.
* Strings: `"foo" "bar"` * Integers: `1 2 3 -99` * Booleans: `true` or `false` * `null` Singletons can be combined to emulate abstract data types found in other functional languages. ### [](#struct)Struct A Struct is a plain [Object](#object) that isn’t a [Document](#document). The value returned from projection on a document is a [Struct](#struct). ### [](#tuple)Tuple Tuples are sequences of zero or more typed values: ```fql-sig [] // The empty tuple + [string, string] // tuple of two string values + [boolean] // tuple of one boolean value + [string, long, boolean] // tuple of a string, long, and boolean + [string[], long[]] // tuple of an Array of strings and an Array of longs ``` Tuple values are subtypes of Arrays of the union of the tuple slot types: ```fql-sig [string, string] // `string[]` + [boolean] // `boolean[]` + [string, long, boolean] // `(string | long | boolean)[]` ``` ### [](#union)Union A Union type allows values of any of its constituent types: ```fql-sig boolean | number // all booleans and numbers + { x: number } | string // all the object types and strings ``` Unions can be combined to emulate abstract data types found in other functional languages: ```fql-sig { type: "point", x: long, y: long } | { type: "circle", x: long, y: long, radius: long } ``` Unlike TypeScript, FQL unions that include [Any](#any) preserve both [Any](#any) and the original types. [Any](#any) does not subsume the other types. This allows for more precise type checking and errors. For example: ```fql let foo: Any | Number = 2 foo // Returns `2` with a type of `Any | Number` ``` The stricter typechecking can impact behavior at runtime. For example, the following TypeScript compiles but would return an error at runtime: ```typescript const foo: any = null; const bar = (true) ? foo : 2; // Attempts to access a property of // null. TypeScript allows this. bar.baz; ``` An equivalent FQL query returns an `invalid_query` error during typechecking: ```fql let foo: Any = null let bar = if (true) foo else 2 // Attempts to access a property of // null. FQL returns an error. bar.baz ``` ## [](#security)Security-related types The security data types are used with Fauna authentication and authorization APIs. ### [](#accessprovider)AccessProvider | Reference: Access providers | | --- | --- | --- | An [`AccessProvider` document](../../fql-api/accessprovider/). `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](#accessprovider) type. See [Access providers](../../../learn/security/access-providers/). ### [](#credential)Credential | Reference: Credential | | --- | --- | --- | A Credential object represents a Credential in a Fauna database. ### [](#key)Key | Reference: Key | | --- | --- | --- | A Key is a JSON object document subtype stored in a database’s Key collection that represents a [Key](../../fql-api/key/). A key provides anonymous access to a database to execute operations permitted by the role associated with the key. ### [](#role)Role | Reference: Role | | --- | --- | --- | A Role is a JSON object document subtype stored in a database’s Role collection that represents a [Role](../../fql-api/role/). ### [](#token)Token | Reference: Token | | --- | --- | --- | Tokens are defined as documents in the Token collection, and are used to control identity-based access to a database.
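These security types behave like other documents: each lives in a system collection and takes its type from that collection. A minimal sketch using the `isa` operator covered earlier (assuming the calling key or token has a role allowed to read the `Key` collection, and that at least one key exists):

```fql
// Keys are documents in the system `Key` collection,
// so they resolve to the `Key` document type.
Key.all().first() isa Key  // true when at least one key exists
```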
# Variables and identifiers This section describes the syntax for variable declaration and the rules for associating named identifiers with variables. ## [](#variables)Variables Variables are used to store values that are one of the FQL [Types](../types/). You assign a value to the variable using the [`let`](../statements/#let) statement: ```fql let foo = 5 ``` By default, variables are immutable so you can’t assign a new value to the variable. Assigning a new value to the variable `x` gives an error: ```fql let x = 5 x = 6 ``` ``` invalid_query: The query failed 1 validation check error: Expected end-of-input at *query*:2:3 | 2 | x = 6 | ^ | ``` But you can redeclare a variable: ```fql let a = 2 let a = 4 a ``` ``` 4 ``` You can also use a previous definition of the variable to redeclare the variable: ```fql let x = 1 let x = x + 1 x ``` ``` 2 ``` Here is a more complex redefinition example using an anonymous function, which shows the effect of the order of declaration: ```fql let x = "foo" let fn = () => x let x = "bar" let y = fn() [y, x] ``` ``` [ "foo", "bar" ] ``` Variables are scoped to the block in which they’re declared as shown in this example: ```fql let x = "foo" let y = { let x = "bar" x // a block must end with an expression } [x, y] ``` ``` [ "foo", "bar" ] ``` ## [](#identifier)Identifiers An identifier associates a name with a value. The name can be used to refer to a variable, property, field, or resource. * An identifier must start with a character from the set `[a-zA-Z_]` and be followed by zero or more characters from the set `[a-zA-Z0-9_]`. An identifier defined with the [let](../statements/#let) statement can’t be a single underscore (`_`) character. The single underscore is allowed as an anonymous function parameter and can be repeated in an anonymous function parameter list. * Identifiers are case-sensitive. * An identifier can’t be a [reserved](../reserved/) word except as described in [Naming and aliasing](../naming/). ## [](#see-also)See also [Naming and aliasing](../naming/) [Blocks and lexical scoping](../blocks/) # Blocks and lexical scoping ## [](#blocks)Blocks A block is an expression that encapsulates one or more statements or expressions, and is enclosed by `{ }`: ```fql { let a = 1 let b = 2 a + b } ``` ``` 3 ``` The last statement in a block must be an [expression](../statements/). A block is itself an expression, and has the type and result of the last expression of the block: ```fql let x = { let a = 1 let b = 2 a + b } x ``` ``` 3 ``` ## [](#scope)Scope A block defines variable scope. Variables declared in a block are scoped to the block and can’t be referenced outside of the block. ```fql let x = "foo" let y = { let x = "bar" x } [x, y] ``` ``` [ "foo", "bar" ] ``` In the example, two `x` variables are declared. One with local scope and one with global scope. Variable `y` has global scope and is the value of the block. # Field accessors and method chaining This section covers field accessors and method chaining. ## [](#field-accessor)Field access Accessors can be viewed as keys for referencing fields or properties in an associative array, dictionary, or lookup table. In Fauna documents, fields can be accessed using dot notation or bracket notation. 
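As a quick preview using a hypothetical inline object, both notations resolve the same field; each notation is covered in detail in the subsections below:

```fql
let customer = {
  name: "Alice Appleseed",
  address: { state: "DC" }
}
// Dot notation and bracket notation access the same field.
[ customer.address.state, customer["address"]["state"] ]  // [ "DC", "DC" ]
```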
### [](#dot-notation-field-accessor)Dot notation field accessor The dot prefix (`.`) preceding the field name is used to access a field or property, as shown by the `.aKey` notation in this example: ```fql let object = { aKey: "one", bKey: "two" } object.aKey ``` ``` "one" ``` Providing a field that doesn’t exist returns an error. ### [](#bracket-notation-field-accessor)Bracket notation field accessor The following example uses bracket notation (`[ ]`), passing the field name as a string to access the field or property: ```fql let object = { aKey: "one", bKey: "two" } object["aKey"] ``` ``` "one" ``` Providing a field that doesn’t exist returns `null`. Using bracket notation allows you to dynamically access document fields as shown in this example: ```fql let object = { aKey: "one", bKey: "two" } let x = "aKey" object[x] ``` ``` "one" ``` ## [](#method-chaining)Method chaining Methods can be chained using dot notation to compose complex queries where the output of the previous method is the input of the next method. In this example, the query returns up to ten documents from all the documents in the `Product` collection: ```fql Product.all().take(10) ``` ## [](#optional-chaining)Optional chaining A variant of dot notation, the [optional chaining](../operators/#optional-chaining) operator can be used to access a field or invoke a method, returning `null` instead of an error if the left side of the expression evaluates to `null`. Example: ```fql let customer = { name: "Alice Appleseed", address: { state: "DC" } } customer.address?.state // Returns `"DC"` ``` # Operators This section describes the FQL operators. See [Operator precedence](../precedence/) for the operator precedence and associativity table. ## [](#assignment)Assignment | Operator | Syntax | Description | | --- | --- | --- | | = | variable [:type] = value | Simple assignment operator assigns the value to the declared variable. The optional :type notation constrains a value to the given type, where type is one of the supported Types. For example, `let x:String = "5"` is a valid statement, whereas `let x:String = 5` results in an invalid_query error. You can use an if …​ else statement to conditionally assign a variable value. For an example, see Conditional assignment. | ## [](#arithmetic)Arithmetic The arithmetic operators perform arithmetic operations on numeric operands. | Operator | Syntax | Description | | --- | --- | --- | | + | operand1 + operand2 | Addition, sums the operands. | | - | operand1 - operand2 | Subtraction, subtracts operand2 from operand1. | | * | operand1 * operand2 | Multiplication, multiplies the operands. | | / | operand1 / operand2 | Division, divides operand1 by operand2. | | % | operand1 % operand2 | Modulo, returns the remainder of operand1 divided by operand2 and takes the sign of the dividend. | | ** | operand1 ** operand2 | Exponentiation, returns the result of raising operand1 to the power of operand2. | ## [](#concatenation)Concatenation The plus operator performs concatenation on string operands. | Operator | Syntax | Description | | --- | --- | --- | | + | operand1 + operand2 | For String operands, concatenates the operands, left-to-right. | ## [](#comparison)Comparison Comparison operators return a [Boolean](../types/#boolean) value. For the `<`, `>`, `<=`, and `>=` operators, comparison across types always returns `false` (for example, `1 < "2"` returns `false`), and comparing non-comparable objects always returns `false`.
| Operator | Syntax | Description | | --- | --- | --- | | == | expression1 == expression2 | Equal to. Returns true if expression1 and expression2 have the same value. Otherwise, returns false. When comparing documents, only the document id metadata fields are compared and all other fields are ignored. Sets are equal if they have the same elements and the elements are in the same order. | | != | expression1 != expression2 | Not equal to. Returns true if expression1 and expression2 do not have the same value. Otherwise, returns false. | | > | expression1 > expression2 | Greater than. Returns true if expression1 is greater than expression2. Comparison across types returns false. Comparison of non-comparable objects, such as anonymous functions and Set cursors, returns false. Comparing Sets isn’t supported but a Set can be converted to an Array and then compared. See set.toArray(). | | >= | expression1 >= expression2 | Greater than or equal to. Returns true if expression1 is greater than or equal to expression2. Comparison across types returns false. Comparison of non-comparable objects, such as anonymous functions and Set cursors, returns false. Comparing Sets isn’t supported but a Set can be converted to an Array and then compared. See set.toArray(). | | < | expression1 < expression2 | Less than. Returns true if expression1 is less than expression2. Comparison across types returns false. Comparison of non-comparable objects, such as anonymous functions and Set cursors, returns false. Comparing Sets isn’t supported but a Set can be converted to an Array and then compared. See set.toArray(). | | <= | expression1 <= expression2 | Less than or equal to. Returns true if expression1 is less than or equal to expression2. Comparison across types returns false. Comparison of non-comparable objects, such as anonymous functions and Set cursors, returns false. Comparing Sets isn’t supported but a Set can be converted to an Array and then compared. See set.toArray(). | | isa | expression1 isa type | Evaluates whether expression1 is of _type_, which can be any of the supported runtime Types. The _type_ must be a module object. Example: `"1" isa String` returns true | ### [](#check-a-values-type-with-isa)Check a value’s type with `isa` Use the [`isa`](#isa) operator to check if a value is of a specific FQL type. For example: ```fql "foo" isa String // true 123 isa String // false 123 isa Int // true 0.123 isa Double // true 123 isa Number // true 0.123 isa Number // true { a: "foo", b: "bar" } isa Object // true [ 1, 2, 3 ] isa Array // true Product.all() isa Set // true // For documents, the type is the name // of the collection. For example, `Product` // collection documents have a type of `Product`. Product.byName('limes').first() isa Product // true // Document references also resolve to the // document type. Product.byId('111') isa Product // true // Dangling references, which point to documents // that don't exist, still resolve to the // document type. For example, the following // document doesn't exist. Product.byId('999') isa Product // true ``` `isa` doesn’t support [parameterized generic types](../static-typing/#generic), such as `Ref` or `Array`. Queries that attempt to use `isa` with such types return an `invalid_query` error: ```fql // NOT SUPPORTED: [ 1, 2, 3 ] isa Array // NOT SUPPORTED: Product.all() isa Set // NOT SUPPORTED: Product.byId('111') isa Ref ``` ## [](#logical)Logical Logical operators return a [Boolean](../types/#boolean) value.
| Operator | Syntax | Description | | --- | --- | --- | | || | operand1 || operand2 | Logical OR. Returns true if at least one of the operands is true. Otherwise, returns false. | | && | operand1 && operand2 | Logical AND. Returns true if both operands are true. Otherwise, returns false. | ## [](#bitwise-operators)Bitwise operators | Operator | Syntax | Description | | --- | --- | --- | | | | expression1 | expression2 | Bitwise inclusive OR. For each bit position, the result bit is 1 if either operand bit is 1: 0 | 0 = 0, 0 | 1 = 1, 1 | 0 = 1, 1 | 1 = 1. | | ^ | expression1 ^ expression2 | Bitwise exclusive OR (XOR). For each bit position, the result bit is 1 if exactly one operand bit is 1: 0 ^ 0 = 0, 0 ^ 1 = 1, 1 ^ 0 = 1, 1 ^ 1 = 0. | | & | expression1 & expression2 | Bitwise AND. For each bit position, the result bit is 1 only if both operand bits are 1: 0 & 0 = 0, 0 & 1 = 0, 1 & 0 = 0, 1 & 1 = 1. | ## [](#unary)Unary | Operator | Syntax | Description | | --- | --- | --- | | - | -operand or - operand | Unary negation. Negates the operand it precedes. | | ! | !operand | Logical negation (NOT). Inverts the value of the Boolean operand and returns a Boolean value. | Examples: Unary negation variations: ```fql -5 // -5. A negative literal, not unary negation. - 5 // -5. Unary negation applied to a literal. let num = 5 -num // -5. Unary negation of a variable. ``` Insert a space between `-` and _operand_ if _operand_ is a literal [Number](../types/#number). Logical NOT: ```fql !false // true ``` ## [](#optional-chaining)Optional chaining | Operator | Syntax | Description | | --- | --- | --- | | ?. | object.value?.field or object.method?.(args) | Optional chaining. When accessing a field or invoking a method, if the left side of the expression evaluates to null, return null. If chained to other methods, the null value is passed to the method. For example, {a: null}.a?.toString() passes null to string.toString(), which returns "null", a String representation of Null. | Examples: ```fql let customer = { name: "Alice Appleseed", address: { state: "DC" } } customer.address?.state ``` The optional chaining operator can also be used with methods: ```fql let customer = { name: "Alice Appleseed", address: { state: "DC" } } customer.name?.toLowerCase() ``` ## [](#null-coalescing)Null coalescing | Operator | Syntax | Description | | --- | --- | --- | | ?? | expression1 ?? expression2 | Null coalescing. If expression1 evaluates to null, return the expression2 result. Otherwise, return the expression1 result. | Example: ```fql // Gets a `Customer` collection document. let customer = Customer.byEmail("carol.clark@example.com").first() // This customer's `cart` field is `null`. customer?.cart ?? "Not found" // returns "Not found" ``` ## [](#non-null)Non-null assertion postfix | Operator | Syntax | Description | | --- | --- | --- | | ! | expression! | Non-null assertion postfix. Runtime validation: if expression evaluates to null, return an error. Otherwise, return the result.
| Example: ```fql let customer = { name: "Alice Appleseed", address: { state: "DC" } } customer.date! // Returns an error. ``` ## [](#ternary-operator)Ternary operator FQL doesn’t have a ternary (conditional) operator. You can get the same result using an [`if …​ else`](../statements/#if) statement. For example, to perform an upsert: ```fql // Customer email to look up let email = "alice.appleseed@example.com" // Customer data to upsert let data = { name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } // Try to find the existing customer by email. // If the customer doesn't exist, returns `null`. let customer = Customer.byEmail(email).first() // Create or update the customer based on existence. // If customer is null, create a new customer. // Otherwise, update the existing one. if (customer == null) { Customer.create(data) } else { customer!.update(data) } ``` ### [](#null-checking)Null checking If you’re checking for a null value, you can use the [null coalescing (`??`)](#null-coalescing) operator to return a default value when an expression is `null`. For example: ```fql // Try to find the existing customer by email. // If the customer doesn't exist, return `null`. let customer = Customer.byEmail("carol.clark@example.com")?.first() // Use the null coalescing (??) operator to return the customer's // cart. If the customer or their cart is `null`, return `"Not found"`. // In this case, the customer's cart is `null`. customer?.cart ?? "Not found" // returns "Not found" ``` # Operator precedence An expression is evaluated in the order determined by operator precedence, and precedence is meaningful only if the expression includes multiple operators. The higher precedence operator is evaluated first. Operand grouping can be enforced by using parentheses. Associativity rules are applied to the order of operation for operators that have the same precedence. Precedence order, from highest to lowest, and associativity are listed here: | Description | Operators and syntactic elements | Associativity | | --- | --- | --- | | field access, optional chaining, non-null assertion, function call | ., =>, (), ?., ! | left-to-right; function call: n/a | | unary, logical NOT | -, ! | n/a | | exponentiation | ** | right-to-left | | multiplication, division, modulo | *, /, % | left-to-right | | addition, subtraction | +, - | left-to-right | | bitwise AND | & | left-to-right | | bitwise XOR | ^ | left-to-right | | bitwise OR | | | left-to-right | | is-a comparison | isa | left-to-right | | comparison | >, <, >=, <= | left-to-right | | equality | ==, != | left-to-right | | logical AND | && | left-to-right | | logical OR | || | left-to-right | | Null coalescing | ?? | left-to-right | # Expressions and statements A query is a series of zero or more statements and expressions separated by newlines or semicolons. Multiple statements and expressions in a [block](../blocks/) must end with an expression. Statements and expressions can span multiple lines, and there is no line continuation character. ## [](#at)`at` Get the result of an expression at a given time. ### [](#syntax)Syntax ```fql-sig at (timestamp: Time | TransactionTime) expression ``` ### [](#description)Description The at expression gets the result of an expression at a specified timestamp, provided as a [Time](../../fql-api/time/) or [TransactionTime](../../fql-api/transactiontime/).
You can use an `at` expression to run a query on the [historical snapshot](../../../learn/doc-history/) of one or more documents. This is called a [temporal query](../../../learn/doc-history/). #### [](#minimum-viable-timestamp-mvt)Minimum viable timestamp (MVT) The minimum viable timestamp (MVT) is the earliest point in time that you can query a collection’s document history. The MVT is calculated as the query timestamp minus the collection’s [`history_days`](../../../learn/doc-history/#history-retention) setting: ``` MVT = query timestamp - collection's `history_days` ``` [Temporal queries](../../../learn/doc-history/#temporal-query) using an `at` expression can’t access document snapshots that are older than the MVT. Any query that attempts to access a document snapshot before the MVT returns an error with the `invalid_request` [error code](../../http/reference/errors/) and a 400 HTTP status code: ``` { "code": "invalid_request", "message": "Requested timestamp 2099-01-09T00:36:53.334372Z less than minimum allowed timestamp 2099-01-10T00:21:53.334372Z." } ``` For example, if a collection has `history_days` set to `3` and you run a query that attempts to access a document in the collection from 4 days ago, the query returns an error. ### [](#examples)Examples The following example gets the current document and a snapshot of the document from yesterday. 1. After running this query yesterday to create a document, ```fql Product.create({ id: "9780547928227", name: "lemon", state: "yesterday state" }) ``` ``` { id: "9780547928227", coll: Product, ts: Time("2099-04-10T16:22:32.420Z"), name: "lemon", state: "yesterday state" } ``` 2. And running this query today to update the document, ```fql Product.byId("9780547928227")?.updateData({ state: "today state" }) ``` ``` { id: "9780547928227", coll: Product, ts: Time("2099-04-10T16:23:03.520Z"), name: "lemon", state: "today state" } ``` 3. The following query returns the current data, ```fql Product.byId("9780547928227")?.state ``` ``` "today state" ``` 4. And the following query returns the data from yesterday, ```fql let yesterday = Time.now().subtract(1, "day") at (yesterday) { Product.byId("9780547928227")?.state } ``` ``` "yesterday state" ``` The following examples show that when comparing documents using the `==` operator, only the document `id` field is compared, ignoring all other fields. Compare the `state` fields of different versions of the same document, which differ because of the state change between yesterday and today: ```fql let yesterday = Time.now().subtract(1, "day") let product1 = at(yesterday) { Product.byId('9780547928227') } let product2 = Product.byId('9780547928227') product1?.state == product2?.state ``` ``` false ``` Compare versions of the full document. The documents are identical because the document `id` fields are the same: ```fql let yesterday = Time.now().subtract(1, "day") let product1 = at(yesterday) { Product.byId('9780547928227') } let product2 = Product.byId('9780547928227') product1 == product2 ``` ``` true ``` ## [](#if)`if …​ else` Use conditional branching for execution control flow. ### [](#syntax-2)Syntax ```fql-sig if (__expression1__) __expression2__ [else __expression3__] ``` ### [](#description-2)Description The `if` and `if …​ else` expressions conditionally execute a block depending on the [Boolean](../types/#boolean) value of _expression1_. If _expression1_ evaluates to `true`, _expression2_ executes. Otherwise, _expression3_ executes. 
The last expression in the `if` or `else` block that satisfies the condition is the value returned for the block. The result of the `if` expression is a value that can be assigned to a variable. If the expression evaluates to `false` and `else` isn’t included, the expression returns `null`. ### [](#examples-2)Examples Change the query result when the `if` condition evaluates to `true`: ```fql if (5 > 3) "higher" ``` ``` "higher" ``` Use an `else` block if the condition evaluates to `false`: ```fql if (5 > 6) { "higher" } else { "lower" } ``` ``` "lower" ``` Assign the result of the `if` expression to a variable: ```fql let state = if (5 > 6) { "higher" } else { "lower" } state ``` ``` "lower" ``` ### [](#ternary-operator)Ternary operator FQL doesn’t have a ternary (conditional) operator. You can get the same result using an [`if …​ else`](#if) statement. For example, to perform an upsert: ```fql // Customer email to look up let email = "alice.appleseed@example.com" // Customer data to upsert let data = { name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } // Try to find the existing customer by email. // If the customer doesn't exist, returns `null`. let customer = Customer.byEmail(email).first() // Create or update the customer based on existence. // If customer is null, create a new customer. // Otherwise, update the existing one. if (customer == null) { Customer.create(data) } else { customer!.update(data) } ``` #### [](#null-checking)Null checking If you’re checking for a null value, you can use the [null coalescing (`??`)](../operators/#null-coalescing) operator to return a default value when an expression is `null`. For example: ```fql // Try to find the existing customer by email. // If the customer doesn't exist, return `null`. let customer = Customer.byEmail("carol.clark@example.com")?.first() // Use the null coalescing (??) operator to return the customer's // cart. If the customer or their cart is `null`, return `"Not found"`. // In this case, the customer's cart is `null`. customer?.cart ?? "Not found" // returns "Not found" ``` ## [](#let)`let` Assign a value to a variable. ### [](#syntax-3)Syntax ```fql-sig let __name__ [: __type__] = __value__ ``` ### [](#description-3)Description A `let` statement declares a variable with a _name_ identifier and assigns a _value_ to the variable. Optionally, you can type the variable as any of the supported [types](../types/). The _name_ can be any valid [identifier](../variables/#identifier) but can’t be a single underscore (\_) character. The variable is scoped to the [block](../blocks/) in which it is declared. A variable can be referenced in a nested block, but a variable declared in a nested block can’t be referenced outside of the block. Assigning a value to a variable can be done only using the `let` statement. You can’t assign a value to _name_ directly: ```fql let x = 100 // valid assignment x = 1000 // invalid assignment ``` A variable of the same _name_ can be declared again in the same block and assigned a new value. 
It is a new variable declaration, and the previously declared variable can no longer be referenced: ```fql let x = 100 // (A) valid assignment let x = 1000 // (B) valid assignment and (A) can no longer be referenced x // returns 1000 ``` ### [](#examples-3)Examples Declare and assign [Int](../types/#int), [String](../types/#string), and [Array](../types/#array) values to variables: ```fql let xInt = 5 let yString = "hello" let zArray = [xInt, yString] [xInt, yString, zArray] ``` ``` [ 5, "hello", [ 5, "hello" ] ] ``` Variable scoping example where the declared `x` variable has global scope, while the scope of variables `a` and `b` is limited to the block: ```fql let x = { // a and b are only accessible in this block let a = 1 let b = 2 a + b } x ``` ``` 3 ``` Assign a new value to the variable: ```fql let x = "foo" let fn = () => x let x = "bar" let y = fn() [x, y] ``` ``` [ "bar", "foo" ] ``` Use a previous definition of the variable when redefining the variable: ```fql let x = 1 let x = x + 1 x ``` ``` 2 ``` Type a variable: ```fql let x: Int = 2 x ``` ``` 2 ``` Incorrectly typing a variable results in an error: ```fql let x: Int = "2" x ``` ``` invalid_query: The query failed 1 validation check error: Type `"2"` is not a subtype of `Int` at *query*:1:5 | 1 | let x: Int = "2" | ^ | cause: Type `String` is not a subtype of `Int` | 1 | let x: Int = "2" | ^^^ | ``` ### [](#conditional-assignment)Conditional assignment You can use `let` with an [`if …​ else`](#if) statement to conditionally assign a variable value. For example: ```fql let x = "cat" let y = if (x == "cat") { "Meow" } else if (x == "dog") { "Woof" } else { "Call the exterminator" } y // Returns the `y` var. ``` ``` "Meow" ``` # Anonymous functions An anonymous function is a function that doesn’t have a name, and is typically used as an argument to another function. Functions are defined and managed using the [Function](../../fql-api/function/) API. ## [](#declaration-and-scope)Declaration and scope Fauna Query Language (FQL) uses the JavaScript-style short-form arrow syntax to declare an anonymous function. Function declaration has the generalized form: (\[param\[, param\]\]) => {\[statement\[ statement …​\] expression\]} The _param_ can be any valid [identifier](../variables/#identifier) including a single underscore (\_), which can be repeated in the _param_ list. An anonymous function block can include field access constructs, operators, function arguments, and nested blocks. Variables declared in an anonymous function block are scoped to the anonymous function, but the function can access the variables in the parent block. The last expression is the return value of the function. There is no explicit `return` keyword. ## [](#simplified-syntax)Simplified syntax For a function with one parameter, the `()` enclosing the parameter can be omitted. If the function body consists of a single-line expression, the `{}` block can also be omitted. These examples are equivalent and return a value of `6`: ```fql let Double = (x) => {x + x} Double(3) ``` ```fql let Double = x => x + x Double(3) ``` ## [](#predicates)Predicates A predicate is an anonymous, read-only FQL function that evaluates to `true`, `false`, or `null` (which is interpreted as `false`). It acts as a test to determine if one or more conditions are met. Some FQL methods and FSL schema properties accept predicates. 
For example, [`set.where()`](../../fql-api/set/where/) accepts a predicate as its argument: ```fql Customer.all().where(c => c.address.postalCode == "20220" && c.name == "Alice Appleseed") ``` ## [](#shorthand-syntax)Shorthand syntax Some contexts support a shorthand syntax that lets you omit the parameter name and arrow (`=>`). For example, the following query is equivalent to the previous one: ```fql Customer.all().where(.address.postalCode == "20220" && .name == "Alice Appleseed") ``` The syntax supports dot notation and bracket notation. The following query is equivalent to the previous one: ```fql Customer.all().where(.["address"]["postalCode"] == "20220" && .["name"] == "Alice Appleseed") ``` ## [](#variadic)Variadic arguments Use the `...` syntax to create a variadic function that accepts an indefinite number of arguments, including zero. ```fql let getLength = (...args) => args.length getLength(1, 2, 3) ``` ``` 3 ``` A function can only accept one variadic argument. It must be the last argument. Variadic arguments are collected into an [Array](../types/#array). You can define a type signature to limit the types of values accepted and held in the Array. For example, the following function accepts a single [String](../types/#string) argument followed by a variadic argument of zero or more [Number](../types/#number)s: ```fql let formatCurrency: (String, ...Number) => String = (symbol, ...amounts) => { symbol + amounts.reduce((prev, cur) => prev + cur).toString() } formatCurrency("$", 2, 3) ``` ``` "$5" ``` # Naming and aliasing This section covers guidelines for naming resources. ## [](#collision)Resolving document field name collisions The document metadata `id`, `coll`, and `ts` fields are reserved. If a document must use a reserved word as a field name, it can be nested in the document `data` field to avoid colliding with a reserved name. In the following example, the reserved word `id` is nested in the `data` field to avoid collision: ``` { id: "777", coll: Product, ts: Time("2099-06-25T21:16:36.610Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("789"), data: { id: "limes" } } ``` ## [](#global-namespace-prefix)Global namespace prefix The `FQL` prefix is reserved to disambiguate global Fauna schema entities and native API names from user-defined entities. For example, if you define a UDF named `log()`, you can access the built-in [`log()`](../../fql-api/globals/log/) method using the `FQL` prefix: ```fql FQL.log("hello world") ``` The `FQL` prefix isn’t required if user-defined names don’t conflict with the global Fauna names. ## [](#aliasing)Schema aliasing Collections and UDFs accept an _alias_ field that can then be used to reference a resource with a name that conflicts with a [reserved schema name](../reserved/#reserved-schema). By creating and using an alias, the resource doesn’t have to be renamed. This example shows how to use a schema alias to rename a Collection: ```fsl @alias(MyLegacyCollection) collection "my-legacy-collection" { } ``` Defining an alias this way allows you to refer to the aliased entity using the alias name in subsequent requests. Because identifiers are case-sensitive, the alias must be written exactly as declared: ```fql MyLegacyCollection.firstWhere(.name < "Z") ``` If the original name doesn’t conflict with an existing name, the entity is available using its name or the alias. If the name does conflict, the entity is available only by the alias.
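Continuing the example above, once the alias is in place the collection can be queried through the alias name like any other collection (a brief sketch reusing the aliased `MyLegacyCollection`):

```fql
// Query the aliased collection through its alias name.
MyLegacyCollection.all().first()
```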
## [](#see-also)See also [Reserved words](../reserved/) # Projection and field aliasing Projection allows you to select the fields to be returned and is supported for [Struct](../types/#struct), [Array](../types/#array), [Set](../types/#set), and [Document](../types/#document) types. If you apply projection to a type that doesn’t support projection, the result is an object with the same shape as the projection request but all field values are set to `null`. If a requested field doesn’t exist in the projected object, the returned field value is set to `null`. Fields are returned in the order listed in the query. ## [](#struct-projection)Struct projection The result of projection on a [Struct](../types/#struct) returns a struct with only the requested fields extracted. ```fql let customer = { name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } } customer { name, email, address { street } } ``` ``` { name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St" } } ``` ## [](#array-projection)Array projection The result of projection on an Array is an Array with the projection applied to each element of the Array: ```fql let stores = [ { name: "DC Fruits", address: { street: "13 Pierstorff Drive", city: "Washington", state: "DC", zipCode: "20220" } }, { name: "Party Supplies", address: { street: "7529 Capitalsaurus Court", city: "Washington", state: "DC", zipCode: "20002" } }, { name: "Foggy Bottom Market", address: { street: "4 Florida Ave", city: "Washington", state: "DC", zipCode: "20037" } } ] stores { name } ``` ``` [ { name: "DC Fruits" }, { name: "Party Supplies" }, { name: "Foggy Bottom Market" } ] ``` ## [](#set)Set projection The result of projection on a [Set](../types/#set) is a new Set where the field selections are applied to each element in the original Set. The result type on a Set is a Set. For example: ```fql Category.all() { name } ``` ``` { data: [ { name: "party" }, { name: "frozen" }, { name: "produce" } ] } ``` You can also use the FQL [`set.map()`](../../fql-api/set/map/) method to project Sets. For example, the following projection query: ```fql Product.sortedByPriceLowToHigh() { name, description, price } ``` Is equivalent to the following [`set.map()`](../../fql-api/set/map/) query: ```fql Product.sortedByPriceLowToHigh().map(prod => { name: prod.name, description: prod.description, price: prod.price, }) ``` ## [](#document-projection)Document projection Applying projection to a document, the projected fields are extracted directly from the document. The value returned from projection on a document is a [Struct](../types/#struct). For example: ```fql Category.byName("produce").first() { name } ``` ``` { name: "produce" } ``` ## [](#resolve-document-references)Resolve document references You can use [document references](../../../learn/data-model/relationships/) to create relationships between documents. The reference acts as a pointer to a document. The reference contains the document’s collection and document ID. For example, the following query updates a `Product` collection document to add a `category` field. The `category` field contains a reference to a `Category` collection document. ```fql let produce = Category.byName("produce").first() Product.byName("limes").first() ?.update({ category: produce }) ``` ``` // An example `Product` collection document. 
{ id: "777", coll: Product, ts: Time("2099-04-10T16:07:02.515Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, // A `Category` document reference. // The reference contains the document's // collection and document ID. category: Category("789") } ``` When the `category` field is projected, the reference is resolved and the full `Category` collection document is returned: ```fql // Gets a `Product` document and projects the // `name` and `category` fields. Product.byName("limes").first() { name, category} ``` ``` { name: "limes", // The projection resolves the `Category` document // reference in the `category` field. category: { id: "789", coll: Category, ts: Time("2099-07-30T22:17:39.945Z"), products: "hdW...", name: "produce", description: "Fresh Produce" } } ``` ## [](#resolve-set-references)Resolve Set references [Sets](../../../learn/data-model/sets/) are not [persistable](../types/#persistable). You can’t store a Set as a field value or create a [field definition](../../../learn/schema/#field-definitions) that accepts a Set. Instead, you can use a [computed field](../../fsl/computed/) to define a read-only function that dynamically fetches a Set: ```fsl collection Customer { ... // Computed field definition for the `orders` field. // `orders` contains a reference to a Set of `Order` collection documents. // The value is computed using the `Order` collection's // `byCustomer()` index to get the customer's orders. compute orders: Set = ( customer => Order.byCustomer(customer)) ... } ``` If the field isn’t [projected](./), it contains an [`after` pagination cursor](../../../learn/query/pagination/#cursor) that references the Set: ```fql // Get a `Customer` document. Customer.byEmail("alice.appleseed@example.com").first() ``` ``` { id: "111", coll: Customer, ts: Time("2099-10-22T21:56:31.260Z"), cart: Order("412483941752112205"), // `orders` contains an `after` cursor that // references the Set of `Order` documents. orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` To materialize the Set, [project](./) the computed field: ```fql let customer = Customer .where(.email == "alice.appleseed@example.com") .first() // Project the `name`, `email`, and `orders` fields. customer { name, email, orders } ``` ``` { name: "Alice Appleseed", email: "alice.appleseed@example.com", orders: { data: [ { id: "412483941752112205", coll: Order, ts: Time("2099-10-22T21:56:31.260Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2099-10-22T21:56:31.104083Z"), payment: {} }, ... ] } } ``` Alternatively, you can pass the `after` cursor to [`Set.paginate()`](../../fql-api/set/static-paginate/): ```fql Set.paginate("hdW...", 2) ``` ``` { // Returns a materialized Set of `Order` documents. data: [ { id: "412483941752112205", coll: Order, ts: Time("2099-10-22T21:56:31.260Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2099-10-22T21:56:31.104083Z"), payment: {} }, ... ] } ``` ## [](#field-aliasing)Field aliasing with anonymous field access You can use field aliasing to rename or combine projected fields in results. To create a field alias, use a valid [identifier](../variables/#identifier) as the key and a [field accessor](../dot-notation/#field-accessor) in dot notation as the field value. 
Similar to [shorthand function syntax](../functions/#shorthand-syntax), omit the parent document identifier from the field accessor value. This syntax is called anonymous field access. Example projecting a nested field: 1. Given the following document: ```fql Customer.byEmail("alice.appleseed@example.com").first() ``` ``` { id: "111", coll: Customer, ts: Time("2099-06-25T12:14:29.440Z"), cart: Order("412653216549831168"), orders: "hdW...", name: 'Alice Appleseed', email: 'alice.appleseed@example.com', address: { street: '87856 Mendota Court', city: 'Washington', state: 'DC', postalCode: '20220', country: 'US' } } ``` 2. Define an alias using dot notation to get the value of the projected field: ```fql Customer.byEmail("alice.appleseed@example.com").first() { myCity: .address.city } ``` ``` { myCity: "Washington" } ``` Example projecting multiple fields: 1. Given the following document: ```fql Customer.byEmail("alice.appleseed@example.com").first() ``` ``` { id: "111", coll: Customer, ts: Time("2099-06-25T12:14:29.440Z"), cart: Order("412653216549831168"), orders: "hdW...", name: 'Alice Appleseed', email: 'alice.appleseed@example.com', address: { street: '87856 Mendota Court', city: 'Washington', state: 'DC', postalCode: '20220', country: 'US' } } ``` 2. Define an alias that references multiple fields to get the value of the projected field. In the following example, the `cityState` alias combines the `city` and `state` fields from the `address` field into a single string. ```fql Customer.byEmail("alice.appleseed@example.com").first() { cityState: .address.city + ", " + .address.state } ``` ``` { cityState: "Washington, DC" } ``` ## [](#dynamic-projection)Dynamic projection You can dynamically project values in one of two ways: * [Driver composition](#driver-comp) * [User-defined functions (UDFs)](#udf) ### [](#driver-comp)Dynamic projection with driver composition The [Fauna client drivers](../../../build/drivers/) compose queries using FQL template strings. You can interpolate variables, including other FQL template strings, into the template strings to compose dynamic queries. You can use an FQL template string to store a projection partial and then interpolate it into an FQL query. For example, using the Fauna [JavaScript driver](../../../build/drivers/js-client/): ```javascript const PRODUCT_PROJECTION = fql`{ name, description, price, category { name, description } }` const query = fql`Product.all() ${PRODUCT_PROJECTION}` ``` ### [](#udf)Dynamic projection with UDFs You can use [user-defined functions (UDFs)](../../../learn/schema/user-defined-functions/) to dynamically project values. For example, you can create a generic UDF that accepts two arguments: * A collection name * An anonymous function containing a projection for the entire collection’s Set The UDF uses [`set.map()`](../../fql-api/set/map/) to project the results. ```fsl function getList(collName, formatFn) { let collection = Collection(collName) // Uses `map()` to project the results. collection.all().map(formatFn) } ``` When called, the UDF calls [`collection.all()`](../../fql-api/collection/instance-all/) and returns the projection from the function argument: ```fql getList("Product", product => product { name, description, price }) // The above function call is equivalent to // the following FQL query: // // let collection = Collection("Product") // Product.all().map(product => product { // name, // description, // price // }) ``` ``` { data: [ { name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698 }, ... 
  ]
}
```

For more granular control, you can define a companion `getFormatter()` UDF that returns a predefined format based on the collection name. If no predefined format exists for the collection name, `getFormatter()` returns an [abort error](../../fql-api/globals/abort/#error). You can then update the previously defined `getList()` UDF to call `getFormatter()`.

```fsl
// Defines the `getFormatter()` UDF.
// Accepts a collection name and returns a
// format for the collection.
function getFormatter(collName) {
  // Defines an object with a format
  // for each accepted collection name.
  let formatterMap = {
    Product: product => product {
      name,
      description,
      price
    },
    Category: category => category {
      name,
      description
    }
  }
  // Use abort() to return an error if the
  // collection name doesn't have a format.
  if (!Object.hasPath(formatterMap, [collName])) {
    abort("No formatter named '#{collName}'")
  }
  formatterMap[collName]
}

// Updates the getList() UDF.
function getList(collName) {
  let collection = Collection(collName)
  // Calls the previous `getFormatter()` UDF.
  let formatFn = getFormatter(collName)
  collection.all().map(formatFn)
}
```

When called, the UDF returns data from the collection in a predefined format:

```fql
getList("Product")
```

```
{
  data: [
    {
      name: "cups",
      description: "Translucent 9 Oz, 100 ct",
      price: 698
    },
    ...
  ]
}
```

# Static typing

FQL supports static typing of queries and user-defined functions. Static typing identifies potential query errors before execution by validating the query’s _shape_, which rules out many runtime errors. FQL verifies that the query is correct for the kinds of values it can accept and runs the query only after type checking passes.

Types are inferred by usage. After a type is determined, usage must be consistent.

## [](#types)Types

A _type_ is the type of a [signature](#signatures) value and can be any of the following:

| Type | Description | Syntax | Examples |
| --- | --- | --- | --- |
| named | A named type has a name. Types, such as Boolean or String, are named types. | name | See a partial list of named types below. |
| literal | A literal value has its own type. For example, 3 has the type 3. | literal value | 1, true, { a: 3 }, "foo" |
| function | A function type doesn’t have a name. It has a list of arguments and a return type. For example, the function concat() has the type (other: string) => string. These types are used for anonymous functions. For example, the where() method accepts a function type as a parameter. | (arg_name: arg_type) => return_type | () => String, (start: Int, end: Int) => String |
| union | A union represents an OR between any number of types. For example, if a function accepts an integer or a string, it accepts the type Int \| String. | type1 \| type2 \| type3 | Int \| String, Document \| Null |
| intersection | An intersection represents an AND between any number of types, combining multiple types into a single type. For example, the types { name: string } and { address: string } can be combined into a single { name: string, address: string } type by using the { name: string } & { address: string } notation. | type1 & type2 & type3 | Int & Number, { name: string } & { address: string } |

The following is a partial list of named types.

| Type | Example value | Notes |
| --- | --- | --- |
| Boolean | true, false | |
| String | "hello world", "" | |
| Number | 3, 9223372036854775807, 2.5 | |
| Int | 3 | Int is a Number type. |
| Long | 9223372036854775807 | Long is a Number type. |
| Double | 2.5 | Double is a Number type. |
| `Array<A>` | [1, 2, "hi", true] | The type parameter A is the type of the values in the Array. |
| `Set<A>` | Collection.all() | The type parameter A is the type of the values in the Set. |
| Time | Time.now() | |
| Date | Date("2099-04-05") | |
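As a minimal sketch of how these types appear in queries, the following example annotates a variable with a union type. It assumes the demo `Product` collection and `byName()` index used in earlier examples:

```fql
// `first()` returns a document or null, so the projected
// name is described by the union type `String | Null`.
let maybeName: String | Null = Product.byName("limes").first()?.name
maybeName
```

With type checking enabled, an incompatible annotation is reported as a `TypeCheckError` before the query runs.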
## [](#generic)Generic types

A _type_ can be a _concrete_ or _generic_ type. A _generic type_ is a type that has type parameters. A _type parameter_ is similar to a named type but isn’t known in a function signature and is, effectively, a placeholder type. The type parameter signature is a single capital letter in sequence, such as `A`, `B`, and `C`.

These types resolve to a _concrete type_ after the value is used. For example, the elements in an Array have the type `A` in signatures because an array can store any type. The type parameter `A` resolves to a concrete type after the Array is constructed.

## [](#signatures)Signatures

A _signature_ is the definition of a field or function in the static environment. FQL method signatures use a simplified form of TypeScript notation and can be in one of the two following formats:

* Function signatures have the form `name(arg_name: arg_type) => return_type`. For example, `concat` has the signature `concat(other: String) => String`. Function signatures are similar to function types, but include a name on the left.
* Field signatures have the form `name: type`. For example, the field `year` on Date has the signature `year: Int`.

_Type parameters_ act like placeholders in a signature. This means that `Array<A>` isn’t a static type because it has a type parameter. When you construct an Array, for example, with the query `[1, 2]`, the concrete type of that Array is `Array<1 | 2>`, because the value of the Array is determined to have the type `1 | 2`. The type parameter `A` is then substituted when calling functions on that Array.

For example, the function `first()` on an Array has the signature `first() => A | Null`. This means that it returns a value that is the same type as the elements of the Array, `A`, or a null value. After `first()` is called, for example, in `[1, 2].first()`, the type `A` is resolved. In this case, `first()` has the concrete type `() => 1 | 2 | Null` because the type `A` resolves to `1 | 2`.

Type parameters on functions are defined implicitly, in alphabetical order. For example, `dbg()` accepts a value and returns that same value, so its signature is `dbg(value: A) => A`. The type parameter `A` is local to the `dbg()` function and is defined implicitly. The `dbg()` function is called a generic function because it has type parameters.

### [](#generic-type-signature)Generic type signature

For types that include parameters, new type parameters start at `B`, followed by `C`, and continue. For example, `concat()` on Arrays has the signature `concat(other: Array<B>) => Array<A | B>`. The type parameter `A` is the type of the Array this is called on, and the type parameter `B` is a new type parameter local to this function. The type `Array` is a [generic type](#generic) because it has type parameters.
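As a minimal sketch of how a generic signature resolves to a concrete one, the following query constructs the `[1, 2]` Array described above and calls `first()` on it:

```fql
// The Array's concrete type is Array<1 | 2>.
let values = [1, 2]
// `first() => A | Null` resolves to `() => 1 | 2 | Null`,
// so the result's type is `1 | 2 | Null`.
values.first()
```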
### [](#concrete-type-signature)Concrete type signature

A concrete type is the type of a query value. Here are some examples:

| Query value | Type signature |
| --- | --- |
| [1, 2] | Array<1 \| 2> |
| Collection.all() | Set |
| Time.now | () => Time |
| "hello".slice(2, 4) | String |
| (num) => "hello".slice(num) | (num: Int) => String |

### [](#udf-signature)UDF signature

The user-defined function (UDF) signature field can be used to define an explicitly typed signature for a UDF. For example:

```fsl
function TypeTest(x: Number, y: Number): Number {
  x + y
}
```

## [](#enable-and-disable-type-checking)Enable and disable type checking

| Scope | Property | Description |
| --- | --- | --- |
| Database | typechecked | Enables or disables type checking for the database. `true` = Enable type checking. `false` = (default) Disable type checking. |
| Query | typecheck | Enables or disables type checking per query. `true` = Enable type checking. `false` = (default) Disable type checking. |

If type checking is enabled for the driver or per query, type checking must also be enabled in the database. Setting the `typecheck` property in the driver, query, or Dashboard overrides the database setting.

Type checking is performed on user-defined function (UDF) definitions and can’t be disabled. If a UDF definition fails type checking, it results in a `QueryRuntimeError`. This differs from the compile-time `TypeCheckError` returned by a query type error.

Disabling type checking can reduce query time and the number of query check phase compute operations, but it can allow errors to surface as runtime errors.

### [](#enable-database-type-checking)Enable database type checking

Enable type checking for a _child database_ by updating the database definition using the [`document.update()`](../../fql-api/document/update/) method:

```fql
Database.byName("childDB")!.update({ typechecked: true })
```

Setting the `typecheck` property using the Dashboard, [driver](#enable-for-query), or [query](#enable-for-query) option overrides the database setting.

### [](#enable-for-query)Enable query type checking

Drivers for the [supported languages](../../../build/drivers/) can be configured to enable type checking at two levels:

* Enable type checking of all queries sent by the driver by setting the `typecheck` property to `true` in the driver client configuration.
* Enable type checking per query by setting the `typecheck` property to `true` in the query options field.

## [](#check-a-values-type)Check a value’s type

Use the [`isa`](../operators/#isa) operator to check if a value is of a specific FQL type. For example:

```fql
"foo" isa String // true
123 isa String // false
123 isa Int // true
0.123 isa Double // true
123 isa Number // true
0.123 isa Number // true
{ a: "foo", b: "bar" } isa Object // true
[ 1, 2, 3 ] isa Array // true
Product.all() isa Set // true

// For documents, the type is the name
// of the collection. For example, `Product`
// collection documents have a type of `Product`.
Product.byName('limes').first() isa Product // true

// Document references also resolve to the
// document type.
Product.byId('111') isa Product // true

// Dangling references, which point to documents
// that don't exist, still resolve to the
// document type. For example, the following
// document doesn't exist.
Product.byId('999') isa Product // true
```
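As a minimal sketch of using `isa` in query logic, the following example branches on a value’s type before reading document fields. It assumes the demo `Product` collection and `byName()` index from earlier examples:

```fql
// `first()` returns a document or null.
let item = Product.byName("limes").first()
// Use `isa` to confirm the value is a `Product` document
// before accessing its fields.
if (item isa Product) {
  item!.name
} else {
  "no matching product"
}
```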
`isa` doesn’t support [parameterized generic types](#generic), such as a `Ref` or `Array` with an explicit type parameter. Queries that attempt to use `isa` with such types return an `invalid_query` error:

```fql
// NOT SUPPORTED:
[ 1, 2, 3 ] isa Array<Int>

// NOT SUPPORTED:
Product.all() isa Set<Product>

// NOT SUPPORTED:
Product.byId('111') isa Ref<Product>
```

# FQL API reference

The API reference documentation gives you detailed syntactic information for working with FQL database entities. Querying, data manipulation, and other capabilities are exposed as APIs on top of the core syntax. This gives you an easy-to-learn language with a rich feature set that remains discoverable over time.

Collections expose an ORM-like API for creating declarative queries, with composable methods for filtering, ordering, and transforming sets of documents as part of core FQL. The core API is enhanced with dedicated syntax to optimize the readability of predicates and projection/transformation.

Working with documents is as simple as reading fields off objects. FQL’s support for first-class foreign keys simplifies working with them by dereferencing documents through associated fields. For example, if a book document has an author stored in an `author` field, the author document is retrieved transparently through field access.

FQL includes a powerful indexing system that enables an iterative approach to developer-driven query optimization. These capabilities make it easy to build sophisticated join-like queries that remain manageable as they scale in complexity, and queries can be tailored to return the exact data the application requires.

The method and field signatures described in this reference use the following notation conventions.

Method signature:

```fql-sig
<name>([<param>: <type> [, ...]]) => <return_type>
```

Field signature:

```fql-sig
<name>: <type>
```

For more information on typing, see [Types](../fql/types/) and [Static typing](../fql/static-typing/).

# FQL cheat sheet

This page provides a quick reference of FQL properties and methods, grouped by functionality.
| AccessProvider.all() | Get a Set of all access providers. | AccessProvider.byName() | Get an access provider by its name. | AccessProvider.create() | Create an access provider. | AccessProvider.firstWhere() | Get the first access provider that matches a provided predicate. | AccessProvider.toString() | Get "AccessProvider" as a String. | AccessProvider.where() | Get a Set of access providers that match a provided predicate. | accessProvider.delete() | Delete an access provider. | accessProvider.exists() | Test if an access provider exists. | accessProvider.replace() | Replace an access provider. | accessProvider.update() | Update an access provider. | Array.sequence() | Create an ordered Array of Numbers given start and end values. | array.length | The number of elements in the Array. | array.aggregate() | Aggregate all elements of an Array. | array.any() | Test if any element of an Array matches a provided predicate. | array.append() | Append a provided element to an Array. | array.at() | Get the Array element at a provided index. | array.concat() | Concatenate two Arrays. | array.distinct() | Get the unique elements of an Array. | array.drop() | Drop the first N elements of an Array. | array.entries() | Add the index to each element of an Array.
| array.every() | Test if every element of an Array matches a provided predicate. | array.filter() | Filter an Array using a provided predicate. | array.first() | Get the first element of an Array. | array.firstWhere() | Get the first element of an Array that matches a provided predicate. | array.flatMap() | Apply a provided function to each Array element and flatten the resulting Array by one level. | array.flatten() | Flatten an Array by one level. | array.fold() | Reduce the Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. | array.foldRight() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. | array.forEach() | Run a provided function on each element of an Array. Can perform writes. | array.includes() | Test if the Array includes a provided element. | array.indexOf() | Get the index of the first Array element that matches a provided value. | array.indexWhere() | Get the index of the first Array element that matches a provided predicate. | array.isEmpty() | Test if an Array is empty. | array.last() | Get the last element of an Array. | array.lastIndexOf() | Get the index of the last Array element that matches a provided value. | array.lastIndexWhere() | Get the index of the last Array element that matches a provided predicate. | array.lastWhere() | Get the last element of an Array that matches a provided predicate. | array.map() | Apply a provided function to each element of an Array. Can’t perform writes. | array.nonEmpty() | Test if an Array is not empty. | array.order() | Sort an Array's elements. | array.prepend() | Prepend an element to an Array. | array.reduce() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses the first element as the initial value. | array.reduceRight() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses the first element as the initial value. | array.reverse() | Reverse the order of an Array's elements. | array.slice() | Get a subset of an Array's elements based on provided indexes. | array.take() | Get the first N elements of an Array. | array.toSet() | Convert an Array to a Set. | array.toString() | Convert an Array to a String. | array.where() | Get the elements of an Array that match a provided predicate. | Bytes() | Convert a Base64-encoded string to an FQL Bytes value. | Bytes.fromBase64() | Convert a Base64-encoded string to an FQL Bytes value. | bytes.toBase64() | Convert an FQL Bytes value to a Base64-encoded string. | bytes.toString() | Convert an FQL Bytes value to a Base64-encoded string. | collection.definition | Get a collection definition, represented as a Collection document with the CollectionDef type. | Collection() | Access a collection by its name. | Collection.all() | Get a Set of all collection definitions. | Collection.byName() | Get a collection definitions by its name. | Collection.create() | Create a collection. | Collection.firstWhere() | Get the first collection definition that matches a provided predicate. | Collection.toString() | Get "Collection" as a String. | Collection.where() | Get a Set of collection definitions that match a provided predicate. 
| collectionDef.delete() | Delete a collection. | collectionDef.exists() | Test if a collection exists. | collectionDef.replace() | Replaces a collection definition. | collectionDef.update() | Update a collection definition. | collection.all() | Get a Set of all documents in a collection. | collection.byId() | Get a collection document by its document id. | collection.create() | Create a collection document. | collection.firstWhere() | Get the first collection document that matches a provided predicate. | collection.indexName() | Call an index as a method to get a Set of matching collection documents. | collection.where() | Get a Set of collection documents that match a provided predicate. | Credential.all() | Get a Set of all credentials. | Credential.byDocument() | Get a credential by its identity document. | Credential.byId() | Get a credential by its document id. | Credential.create() | Create a credential. | Credential.firstWhere() | Get the first credential that matches a provided predicate. | Credential.toString() | Get "Credential" as a String. | Credential.where() | Get a Set of credentials that match a provided predicate. | credential.delete() | Delete a credential. | credential.exists() | Test if a credential exists. | credential.login() | Create a token for a provided credential and its password. | credential.replace() | Replace a credential. | credential.update() | Update a credential. | credential.verify() | Test whether a provided password is valid for a credential. | Database.all() | Get a Set of all child databases nested directly under the database. | Database.byName() | Get a child database by its name. | Database.create() | Create a child database. | Database.firstWhere() | Get the first child database document that matches a provided predicate. | Database.toString() | Get "Database" as a String. | Database.where() | Get a Set of child databases that match a provided predicate. | database.delete() | Deletes a child database. | database.exists() | Test if a child database exists. | database.replace() | Replace a child database's metadata and settings. | database.update() | Update a child database's metadata and settings. | dayOfMonth | Get the day of the month from a Date. | dayOfWeek | Get the day of the week from a Date. | dayOfYear | Get the day of the year from a Date. | month | Get the month of a Date. | year | Get the year of a Date. | Date() | Construct a Date from a ISO 8601 date String. | Date.fromString() | Construct a Date from a date String. | Date.today() | Get the current UTC Date. | date.add() | Add number of days to a Date. | date.difference() | Get the difference between two Dates. | date.subtract() | Subtract number of days from a Date. | date.toString() | Convert a Date to a String. | document.delete() | Delete a collection document. | document.exists() | Test if a collection document exists. | document.replace() | Replace all fields in a collection document. | document.update() | Update a collection document's fields. | eventSource.map() | Apply an anonymous function to each element of an event source's tracked Set. | eventSource.toString() | Get "[event source]" as a string. | eventSource.where() | Create an event source that emits events for a subset of another event source’s tracked Set. | function.definition | Get or update a user-defined function (UDF)'s definition, represented as a Function document. | Function() | Call a user-defined function (UDF) by its name. | Function.all() | Get a Set of all user-defined functions (UDFs). 
| Function.byName() | Get a user-defined function (UDF) by its name. | Function.create() | Create a user-defined function (UDF). | Function.firstWhere() | Get the first user-defined function (UDF) that matches a provided predicate. | Function.toString() | Get "Function" as a String. | Function.where() | Get a Set of user-defined functions (UDFs) that match a provided predicate. | functionDef.delete() | Delete a user-defined function (UDF). | functionDef.exists() | Test if a user-defined function (UDF) exists. | functionDef.replace() | Replace a user-defined function (UDF). | functionDef.update() | Update a user-defined function (UDF). | abort() | End the current query and return an abort error with a user-defined abort value. | dbg() | Output a debug message in the query summary and return the message in the query results. | ID() | Create a valid ID | log() | Output a log message in the query summary and return null. | newId() | Get a unique string-encoded 64-bit integer. | Key.all() | Get a Set of all keys. | Key.byId() | Get a key by its document id. | Key.create() | Create a key. | Key.firstWhere() | Get the first key that matches a provided predicate. | Key.toString() | Get "Key" as a String. | Key.where() | Get a Set of keys that match a provided predicate. | key.delete() | Delete a key. | key.exists() | Test if a key exists. | key.replace() | Replace a key. | key.update() | Update a key.
| Math.E | Get the Euler’s number mathematical constant (℮). | Math.Infinity | String value representing infinity. | Math.NaN | Value representing Not-a-Number. | Math.PI | Get the mathematical constant pi (π). | Math.abs() | Get the absolute value of a Number. | Math.acos() | Get the inverse cosine in radians of a Number. | Math.asin() | Get the inverse sine in radians of a Number. | Math.atan() | Get the inverse tangent in radians of a Number. | Math.ceil() | Round up a Number. | Math.cos() | Get the cosine of a Number in radians. | Math.cosh() | Get the hyperbolic cosine of a Number. | Math.degrees() | Convert radians to degrees. | Math.exp() | Get the value of ℮ raised to the power of a Number. | Math.floor() | Round down a Number. | Math.hypot() | Get the hypotenuse of a right triangle. | Math.log() | Get the natural logarithm, base e, of a Number. | Math.log10() | Get the base 10 logarithm of a Number. | Math.max() | Get the larger of two Numbers. | Math.mean() | Get the arithmetic mean of an Array or Set of Numbers. | Math.min() | Get the smaller of the input parameter Numbers. | Math.pow() | Get the value of a base raised to a power. | Math.radians() | Convert the value of a Number in degrees to radians. | Math.round() | Get the value of a Number rounded to the nearest integer. | Math.sign() | Get the sign of a Number. | Math.sin() | Get the sine of a Number in radians. | Math.sinh() | Get the hyperbolic sine of a Number. | Math.sqrt() | Get the square root of a Number. | Math.sum() | Get the sum of an Array or Set of Numbers. | Math.tan() | Get the tangent of a Number in radians. | Math.tanh() | Get the hyperbolic tangent of a Number. | Math.trunc() | Truncate a Number to a given precision. | Object.assign() | Copies properties from a source Object to a destination Object. | Object.entries() | Convert an Object to an Array of key-value pairs. | Object.fromEntries() | Convert an Array of key-value pairs to an Object. | Object.hasPath() | Test if an Object has a property. | Object.keys() | Get an Object's top-level property keys as an Array. | Object.select() | Get an Object property’s value by its path. | Object.toString() | Convert an Object to a String. | Object.values() | Get an Object's property values as an Array. | Query.identity() | Get the identity document for the query’s authentication token. | Query.isEnvProtected() | Test if the queried database is in protected mode. | Query.isEnvTypechecked() | Test if the queried database is typechecked. | Query.token() | Get the Token document or JWT payload for the query’s authentication secret. | Role.all() | Get a Set of all user-defined roles. | Role.byName() | Get a user-defined role by its name. | Role.create() | Create a user-defined role. | Role.firstWhere() | Get the first user-defined role matching a provided predicate. | Role.toString() | Get "Role" as a String. | Role.where() | Get a Set of user-defined roles that match a provided predicate. | role.delete() | Delete a user-defined role. | role.exists() | Test if a user-defined role exists. | role.replace() | Replace a user-defined role. | role.update() | Update a user-defined role.
| FQL.Schema.defForIdentifier() | Returns the definition for a user-defined collection or user-defined function (UDF) using the same rules as top-level identifier lookups. | Set.paginate() | Get a page of paginated results using an after cursor. | Set.sequence() | Create an ordered Set of Numbers given start and end values. | Set.single() | Create a Set containing a single provided element. | set.aggregate() | Aggregate all elements of a Set. | set.any() | Test if any element of a Set matches a provided predicate. | set.changesOn() | Create an event source that tracks changes to specified document fields in a supported Set. | set.concat() | Concatenate two Sets. | set.count() | Get the number of elements in a Set. | set.distinct() | Get the unique elements of a Set. | set.drop() | Drop the first N elements of a Set. | set.eventsOn() | Create an event source that tracks changes to specified document fields in a supported Set. | set.eventSource() | Create an event source that tracks changes to documents in a supported Set. | set.every() | Test if every element of a Set matches a provided predicate. | set.first() | Get the first element of a Set. | set.firstWhere() | Get the first element of a Set that matches a provided predicate. | set.flatMap() | Apply a provided function to each Set element and flatten the resulting Set by one level. | set.fold() | Reduce the Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. | set.foldRight() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. | set.forEach() | Run a provided function on each element of a Set. Can perform writes. | set.includes() | Test if the Set includes a provided element. | set.isEmpty() | Test if a Set is empty. | set.last() | Get the last element of a Set. | set.lastWhere() | Get the last element of a Set that matches a provided predicate. | set.map() | Apply a provided function to each element of a Set. Can’t perform writes. | set.nonEmpty() | Test if a Set is not empty. | set.order() | Sort a Set's elements. | set.pageSize() | Set the maximum elements per page in paginated results. | set.paginate() | Convert a Set to an Object with pagination. | set.reduce() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses the first element as the initial value. | set.reduceRight() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses the first element as the initial value. | set.reverse() | Reverse the order of a Set's elements. | set.take() | Get the first N elements of a Set. | set.toArray() | Convert a Set to an Array. | set.toStream() | Create an event source that tracks changes to documents in a supported Set. | set.toString() | Return the string "[set]". | set.where() | Get the elements of a Set that match a provided predicate. | string.length | Get a String's length. | string.at() | Get the character at a specified index of a String. | string.casefold() | Convert a String to lower case using a specified format. | string.concat() | Concatenate two Strings. | string.endsWith() | Test if a String ends with a provided suffix. | string.includes() | Test if a String includes a provided substring. 
| string.includesRegex() | Test if a String contains a substring that matches a provided regular expression. | string.indexOf() | Get the index of the first matching substring within a String. | string.indexOfRegex() | Get the index of the first substring matching a provided regular expression within a String. | string.insert() | Insert a substring into a String at a specified index. | string.lastIndexOf() | Get the index of the last matching substring within a String. | string.matches() | Get the substrings in a String that match a provided regular expression. | string.matchIndexes() | Get the indexes and substrings in a String that match a provided regular expression. | string.parseDouble() | Convert a String to a Double. | string.parseInt() | Convert a String to a Int. | string.parseLong() | Convert a String to a Long. | string.parseNumber() | Convert a String to a Number. | string.replace() | Replace a specified number of occurrences of a substring in a String. | string.replaceAll() | Replace all occurrences of a substring in a String. | string.replaceAllRegex() | Replace all occurrences of substrings matching a regular expression in a String. | string.replaceRegex() | Replace a specified number of occurrences of substrings matching a regular expression in a String. | string.slice() | Get the substring between two indexes of a String. | string.split() | Split a String at a provided separator. | string.splitAt() | Split a String at a provided index. | string.splitRegex() | Split a String using a provided regular expression. | string.startsWith() | Test if a String starts with a provided prefix. | string.toLowerCase() | Convert a String to lower case. | string.toString() | Get a String representation of the value. | string.toUpperCase() | Convert a String to upper case. | dayOfMonth | Get the day of the month from a Time. | dayOfWeek | Get the day of the week from a Time. | dayOfYear | Get the day of the year from a Time. | hour | Get the hour of a Time. | minute | Get the minute of a Time. | month | Get the month of a Time. | second | Get the second of a Time. | year | Get the year of a Time. | Time() | Construct a Time from an ISO 8601 timestamp String. | Time.epoch() | Convert a Unix epoch timestamp to a Time. | Time.fromString() | Construct a Time from an ISO 8601 timestamp String. | Time.now() | Get the current UTC Time. | time.add() | Add a time interval to a Time. | time.difference() | Get the difference between two Times. | time.subtract() | Subtract a time interval from a Time. | time.toMicros() | Convert a Time to a Unix epoch timestamp in microseconds. | time.toMillis() | Convert a Time to a Unix epoch timestamp in milliseconds. | time.toSeconds() | Convert a Time to a Unix epoch timestamp in seconds. | time.toString() | Convert a Time to a String. | Token.all() | Get a Set of all tokens. | Token.byDocument() | Get a token by its identity document. | Token.byId() | Get a token by its document id. | Token.create() | Create a token without a credential or related password. | Token.firstWhere() | Get the first token that matches a provided predicate. | Token.toString() | Get "Token" as a String. | Token.where() | Get a Set of tokens that match a provided predicate. | token.delete() | Delete a token. | token.exists() | Test if a token exists. | token.replace() | Replace a token. | token.update() | Update a token. | TransactionTime() | Get the query transaction time. | TransactionTime.toString() | Get "[transaction time]" as a String. 
| Method | Description |
| --- | --- |
| AccessProvider.all() | Get a Set
of all access providers. | | AccessProvider.byName() | Get an access provider by its name. | | AccessProvider.create() | Create an access provider. | | AccessProvider.firstWhere() | Get the first access provider that matches a provided predicate. | | AccessProvider.toString() | Get "AccessProvider" as a String. | | AccessProvider.where() | Get a Set of access providers that match a provided predicate. | | accessProvider.delete() | Delete an access provider. | | accessProvider.exists() | Test if an access provider exists. | | accessProvider.replace() | Replace an access provider. | | accessProvider.update() | Update an access provider. | | Array.sequence() | Create an ordered Array of Numbers given start and end values. | | array.length | The number of elements in the Array. | | array.aggregate() | Aggregate all elements of an Array. | | array.any() | Test if any element of an Array matches a provided predicate. | | array.append() | Append a provided element to an Array. | | array.at() | Get the Array element at a provided index. | | array.concat() | Concatenate two Arrays. | | array.distinct() | Get the unique elements of an Array. | | array.drop() | Drop the first N elements of an Array. | | array.entries() | Add the index to each element of an Array. | | array.every() | Test if every element of an Array matches a provided predicate. | | array.filter() | Filter an Array using a provided predicate. | | array.first() | Get the first element of an Array. | | array.firstWhere() | Get the first element of an Array that matches a provided predicate. | | array.flatMap() | Apply a provided function to each Array element and flatten the resulting Array by one level. | | array.flatten() | Flatten an Array by one level. | | array.fold() | Reduce the Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. | | array.foldRight() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. | | array.forEach() | Run a provided function on each element of an Array. Can perform writes. | | array.includes() | Test if the Array includes a provided element. | | array.indexOf() | Get the index of the first Array element that matches a provided value. | | array.indexWhere() | Get the index of the first Array element that matches a provided predicate. | | array.isEmpty() | Test if an Array is empty. | | array.last() | Get the last element of an Array. | | array.lastIndexOf() | Get the index of the last Array element that matches a provided value. | | array.lastIndexWhere() | Get the index of the last Array element that matches a provided predicate. | | array.lastWhere() | Get the last element of an Array that matches a provided predicate. | | array.map() | Apply a provided function to each element of an Array. Can’t perform writes. | | array.nonEmpty() | Test if an Array is not empty. | | array.order() | Sort an Array's elements. | | array.prepend() | Prepend an element to an Array. | | array.reduce() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses the first element as the initial value. | | array.reduceRight() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. 
Uses the first element as the initial value. | | array.reverse() | Reverse the order of an Array's elements. | | array.slice() | Get a subset of an Array's elements based on provided indexes. | | array.take() | Get the first N elements of an Array. | | array.toSet() | Convert an Array to a Set. | | array.toString() | Convert an Array to a String. | | array.where() | Get the elements of an Array that match a provided predicate. | | Bytes() | Convert a Base64-encoded string to an FQL Bytes value. | | Bytes.fromBase64() | Convert a Base64-encoded string to an FQL Bytes value. | | bytes.toBase64() | Convert an FQL Bytes value to a Base64-encoded string. | | bytes.toString() | Convert an FQL Bytes value to a Base64-encoded string. | | collection.definition | Get a collection definition, represented as a Collection document with the CollectionDef type. | | Collection() | Access a collection by its name. | | Collection.all() | Get a Set of all collection definitions. | | Collection.byName() | Get a collection definitions by its name. | | Collection.create() | Create a collection. | | Collection.firstWhere() | Get the first collection definition that matches a provided predicate. | | Collection.toString() | Get "Collection" as a String. | | Collection.where() | Get a Set of collection definitions that match a provided predicate. | | collectionDef.delete() | Delete a collection. | | collectionDef.exists() | Test if a collection exists. | | collectionDef.replace() | Replaces a collection definition. | | collectionDef.update() | Update a collection definition. | | collection.all() | Get a Set of all documents in a collection. | | collection.byId() | Get a collection document by its document id. | | collection.create() | Create a collection document. | | collection.firstWhere() | Get the first collection document that matches a provided predicate. | | collection.indexName() | Call an index as a method to get a Set of matching collection documents. | | collection.where() | Get a Set of collection documents that match a provided predicate. | | Credential.all() | Get a Set of all credentials. | | Credential.byDocument() | Get a credential by its identity document. | | Credential.byId() | Get a credential by its document id. | | Credential.create() | Create a credential. | | Credential.firstWhere() | Get the first credential that matches a provided predicate. | | Credential.toString() | Get "Credential" as a String. | | Credential.where() | Get a Set of credentials that match a provided predicate. | | credential.delete() | Delete a credential. | | credential.exists() | Test if a credential exists. | | credential.login() | Create a token for a provided credential and its password. | | credential.replace() | Replace a credential. | | credential.update() | Update a credential. | | credential.verify() | Test whether a provided password is valid for a credential. | | Database.all() | Get a Set of all child databases nested directly under the database. | | Database.byName() | Get a child database by its name. | | Database.create() | Create a child database. | | Database.firstWhere() | Get the first child database document that matches a provided predicate. | | Database.toString() | Get "Database" as a String. | | Database.where() | Get a Set of child databases that match a provided predicate. | | database.delete() | Deletes a child database. | | database.exists() | Test if a child database exists. | | database.replace() | Replace a child database's metadata and settings. 
| | database.update() | Update a child database's metadata and settings. | | dayOfMonth | Get the day of the month from a Date. | | dayOfWeek | Get the day of the week from a Date. | | dayOfYear | Get the day of the year from a Date. | | month | Get the month of a Date. | | year | Get the year of a Date. | | Date() | Construct a Date from a ISO 8601 date String. | | Date.fromString() | Construct a Date from a date String. | | Date.today() | Get the current UTC Date. | | date.add() | Add number of days to a Date. | | date.difference() | Get the difference between two Dates. | | date.subtract() | Subtract number of days from a Date. | | date.toString() | Convert a Date to a String. | | document.delete() | Delete a collection document. | | document.exists() | Test if a collection document exists. | | document.replace() | Replace all fields in a collection document. | | document.update() | Update a collection document's fields. | | eventSource.map() | Apply an anonymous function to each element of an event source's tracked Set. | | eventSource.toString() | Get "[event source]" as a string. | | eventSource.where() | Create an event source that emits events for a subset of another event source’s tracked Set. | | function.definition | Get or update a user-defined function (UDF)'s definition, represented as a Function document. | | Function() | Call a user-defined function (UDF) by its name. | | Function.all() | Get a Set of all user-defined functions (UDFs). | | Function.byName() | Get a user-defined function (UDF) by its name. | | Function.create() | Create a user-defined function (UDF). | | Function.firstWhere() | Get the first user-defined function (UDF) that matches a provided predicate. | | Function.toString() | Get "Function" as a String. | | Function.where() | Get a Set of user-defined functions (UDFs) that match a provided predicate. | | functionDef.delete() | Delete a user-defined function (UDF). | | functionDef.exists() | Test if a user-defined function (UDF) exists. | | functionDef.replace() | Replace a user-defined function (UDF). | | functionDef.update() | Update a user-defined function (UDF). | | abort() | End the current query and return an abort error with a user-defined abort value. | | dbg() | Output a debug message in the query summary and return the message in the query results. | | ID() | Create a valid ID | | log() | Output a log message in the query summary and return null. | | newId() | Get a unique string-encoded 64-bit integer. | | Key.all() | Get a Set of all keys. | | Key.byId() | Get a key by its document id. | | Key.create() | Create a key. | | Key.firstWhere() | Get the first key that matches a provided predicate. | | Key.toString() | Get "Key" as a String. | | Key.where() | Get a Set of keys that match a provided predicate. | | key.delete() | Delete a key. | | key.exists() | Test if a key exists. | | key.replace() | Replace a key. | | key.update() | Update a key. | | Math.E | Get the Euler’s number mathematical constant (℮). | | Math.Infinity | String value representing infinity. | | Math.NaN | Value representing Not-a-Number. | | Math.PI | Get the mathematical constant pi (π). | | Math.abs() | Get the absolute value of a Number. | | Math.acos() | Get the inverse cosine in radians of a Number. | | Math.asin() | Get the inverse sine in radians of a Number. | | Math.atan() | Get the inverse tangent in radians of a Number. | | Math.ceil() | Round up a Number. | | Math.cos() | Get the cosine of a Number in radians. 
| | Math.cosh() | Get the hyperbolic cosine of a Number. | | Math.degrees() | Convert radians to degrees. | | Math.exp() | Get the value of ℮ raised to the power of a Number. | | Math.floor() | Round down a Number. | | Math.hypot() | Get the hypotenuse of a right triangle. | | Math.log() | Get the natural logarithm, base e, of a Number. | | Math.log10() | Get the base 10 logarithm of a Number. | | Math.max() | Get the larger of two Numbers. | | Math.mean() | Get the arithmetic mean of an Array or Set of Numbers. | | Math.min() | Get the smaller of the input parameter Numbers. | | Math.pow() | Get the value of a base raised to a power. | | Math.radians() | Convert the value of a Number in degrees to radians. | | Math.round() | Get the value of a Number rounded to the nearest integer. | | Math.sign() | Get the sign of a Number. | | Math.sin() | Get the sine of a Number in radians. | | Math.sinh() | Get the hyperbolic sine of a Number. | | Math.sqrt() | Get the square root of a Number. | | Math.sum() | Get the sum of an Array or Set of Numbers. | | Math.tan() | Get the tangent of a Number in radians. | | Math.tanh() | Get the hyperbolic tangent of a Number. | | Math.trunc() | Truncate a Number to a given precision. | | Object.assign() | Copies properties from a source Object to a destination Object. | | Object.entries() | Convert an Object to an Array of key-value pairs. | | Object.fromEntries() | Convert an Array of key-value pairs to an Object. | | Object.hasPath() | Test if an Object has a property. | | Object.keys() | Get an Object's top-level property keys as an Array. | | Object.select() | Get an Object property’s value by its path. | | Object.toString() | Convert an Object to a String. | | Object.values() | Get an Object's property values as an Array. | | Query.identity() | Get the identity document for the query’s authentication token. | | Query.isEnvProtected() | Test if the queried database is in protected mode. | | Query.isEnvTypechecked() | Test if the queried database is typechecked. | | Query.token() | Get the Token document or JWT payload for the query’s authentication secret. | | Role.all() | Get a Set of all user-defined roles. | | Role.byName() | Get a user-defined role by its name. | | Role.create() | Create a user-defined role. | | Role.firstWhere() | Get the first user-defined role matching a provided predicate. | | Role.toString() | Get "Role" as a String. | | Role.where() | Get a Set of user-defined roles that match a provided predicate. | | role.delete() | Delete a user-defined role. | | role.exists() | Test if a user-defined role exists. | | role.replace() | Replace a user-defined role. | | role.update() | Update a user-defined role. | | FQL.Schema.defForIdentifier() | Returns the definition for a user-defined collection or user-defined function (UDF) using the same rules as top-level identifier lookups. | | Set.paginate() | Get a page of paginated results using an after cursor. | | Set.sequence() | Create an ordered Set of Numbers given start and end values. | | Set.single() | Create a Set containing a single provided element. | | set.aggregate() | Aggregate all elements of a Set. | | set.any() | Test if any element of a Set matches a provided predicate. | | set.changesOn() | Create an event source that tracks changes to specified document fields in a supported Set. | | set.concat() | Concatenate two Sets. | | set.count() | Get the number of elements in a Set. | | set.distinct() | Get the unique elements of a Set. | | set.drop() | Drop the first N elements of a Set. 
| | set.eventsOn() | Create an event source that tracks changes to specified document fields in a supported Set. | | set.eventSource() | Create an event source that tracks changes to documents in a supported Set. | | set.every() | Test if every element of a Set matches a provided predicate. | | set.first() | Get the first element of a Set. | | set.firstWhere() | Get the first element of a Set that matches a provided predicate. | | set.flatMap() | Apply a provided function to each Set element and flatten the resulting Set by one level. | | set.fold() | Reduce the Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. | | set.foldRight() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. | | set.forEach() | Run a provided function on each element of a Set. Can perform writes. | | set.includes() | Test if the Set includes a provided element. | | set.isEmpty() | Test if a Set is empty. | | set.last() | Get the last element of a Set. | | set.lastWhere() | Get the last element of a Set that matches a provided predicate. | | set.map() | Apply a provided function to each element of a Set. Can’t perform writes. | | set.nonEmpty() | Test if a Set is not empty. | | set.order() | Sort a Set's elements. | | set.pageSize() | Set the maximum elements per page in paginated results. | | set.paginate() | Convert a Set to an Object with pagination. | | set.reduce() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses the first element as the initial value. | | set.reduceRight() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses the first element as the initial value. | | set.reverse() | Reverse the order of a Set's elements. | | set.take() | Get the first N elements of a Set. | | set.toArray() | Convert a Set to an Array. | | set.toStream() | Create an event source that tracks changes to documents in a supported Set. | | set.toString() | Return the string "[set]". | | set.where() | Get the elements of a Set that match a provided predicate. | | string.length | Get a String's length. | | string.at() | Get the character at a specified index of a String. | | string.casefold() | Convert a String to lower case using a specified format. | | string.concat() | Concatenate two Strings. | | string.endsWith() | Test if a String ends with a provided suffix. | | string.includes() | Test if a String includes a provided substring. | | string.includesRegex() | Test if a String contains a substring that matches a provided regular expression. | | string.indexOf() | Get the index of the first matching substring within a String. | | string.indexOfRegex() | Get the index of the first substring matching a provided regular expression within a String. | | string.insert() | Insert a substring into a String at a specified index. | | string.lastIndexOf() | Get the index of the last matching substring within a String. | | string.matches() | Get the substrings in a String that match a provided regular expression. | | string.matchIndexes() | Get the indexes and substrings in a String that match a provided regular expression. | | string.parseDouble() | Convert a String to a Double. 
| | string.parseInt() | Convert a String to a Int. | | string.parseLong() | Convert a String to a Long. | | string.parseNumber() | Convert a String to a Number. | | string.replace() | Replace a specified number of occurrences of a substring in a String. | | string.replaceAll() | Replace all occurrences of a substring in a String. | | string.replaceAllRegex() | Replace all occurrences of substrings matching a regular expression in a String. | | string.replaceRegex() | Replace a specified number of occurrences of substrings matching a regular expression in a String. | | string.slice() | Get the substring between two indexes of a String. | | string.split() | Split a String at a provided separator. | | string.splitAt() | Split a String at a provided index. | | string.splitRegex() | Split a String using a provided regular expression. | | string.startsWith() | Test if a String starts with a provided prefix. | | string.toLowerCase() | Convert a String to lower case. | | string.toString() | Get a String representation of the value. | | string.toUpperCase() | Convert a String to upper case. | | dayOfMonth | Get the day of the month from a Time. | | dayOfWeek | Get the day of the week from a Time. | | dayOfYear | Get the day of the year from a Time. | | hour | Get the hour of a Time. | | minute | Get the minute of a Time. | | month | Get the month of a Time. | | second | Get the second of a Time. | | year | Get the year of a Time. | | Time() | Construct a Time from an ISO 8601 timestamp String. | | Time.epoch() | Convert a Unix epoch timestamp to a Time. | | Time.fromString() | Construct a Time from an ISO 8601 timestamp String. | | Time.now() | Get the current UTC Time. | | time.add() | Add a time interval to a Time. | | time.difference() | Get the difference between two Times. | | time.subtract() | Subtract a time interval from a Time. | | time.toMicros() | Convert a Time to a Unix epoch timestamp in microseconds. | | time.toMillis() | Convert a Time to a Unix epoch timestamp in milliseconds. | | time.toSeconds() | Convert a Time to a Unix epoch timestamp in seconds. | | time.toString() | Convert a Time to a String. | | Token.all() | Get a Set of all tokens. | | Token.byDocument() | Get a token by its identity document. | | Token.byId() | Get a token by its document id. | | Token.create() | Create a token without a credential or related password. | | Token.firstWhere() | Get the first token that matches a provided predicate. | | Token.toString() | Get "Token" as a String. | | Token.where() | Get a Set of tokens that match a provided predicate. | | token.delete() | Delete a token. | | token.exists() | Test if a token exists. | | token.replace() | Replace a token. | | token.update() | Update a token. | | TransactionTime() | Get the query transaction time. | | TransactionTime.toString() | Get "[transaction time]" as a String. | # AccessProvider | Learn: Access providers | | --- | --- | --- | An [access provider](../../../learn/security/access-providers/) registers an external identity provider (IdP), such as Auth0, in your Fauna database. Once [set up](../../../learn/security/access-providers/#config), the IdP can issue JSON Web Tokens (JWTs) that act as Fauna [authentication secrets](../../../learn/security/authentication/#secrets). This lets your application’s end users use the IdP for authentication. ## [](#collection)`AccessProvider` collection Fauna stores access providers as documents in the `AccessProvider` system collection. 
These documents are an FQL version of the FSL [access provider schema](../../fsl/access-provider/). `AccessProvider` documents have the following FQL structure:

```fql
{
  name: "someIssuer",
  coll: AccessProvider,
  ts: Time("2099-09-06T21:46:50.272Z"),
  issuer: "https://example.com/",
  jwks_uri: "https://example.com/.well-known/jwks.json",
  roles: [
    "customer",
    {
      role: "manager",
      predicate: "(jwt) => jwt!.scope.includes(\"manager\")"
    }
  ],
  data: {
    desc: "Access provider for issuer"
  },
  audience: "https://db.fauna.com/db/ysij4khxoynr4"
}
```

| Field | Type | Read-only | Required | Description |
| --- | --- | --- | --- | --- |
| name | String | | true | Unique name for the access provider in the database. Must begin with a letter. Can only include letters, numbers, and underscores. |
| coll | Collection | true | | Collection name: AccessProvider. |
| ts | Time | true | | Last time the document was created or updated. |
| issuer | String | | true | Issuer for the IdP’s JWTs. Must match the iss claim in JWTs issued by the IdP. |
| jwks_uri | String | | true | URI that points to public JSON web key sets (JWKS) for JWTs issued by the IdP. Fauna uses the keys to verify each JWT’s signature. |
| roles | String \| Object \| Array of strings and objects \| Null | | | User-defined roles assigned to JWTs issued by the IdP. Can’t be built-in roles. A roles string is the name of a user-defined role. A roles object contains a `role` field (String): the name of a user-defined role, and a `predicate` field (String): an FQL predicate function. If present, JWTs are only assigned the role if the predicate evaluates to true. The predicate function is passed one argument: an object containing the JWT’s payload. The predicate function does not support shorthand syntax. |
| data | { *: Any } \| Null | | | Arbitrary user-defined metadata for the document. |
| audience | String | true | | Globally unique URL for the Fauna database. audience URLs have the following structure: `https://db.fauna.com/db/<database_id>`, where `<database_id>` is the globally unique ID for the database. Must match the aud claim in JWTs issued by the IdP. |

## [](#static-methods)Static methods

You can use the following static methods to manage the `AccessProvider` collection in FQL.

| Method | Description |
| --- | --- |
| AccessProvider.all() | Get a Set of all access providers. |
| AccessProvider.byName() | Get an access provider by its name. |
| AccessProvider.create() | Create an access provider. |
| AccessProvider.firstWhere() | Get the first access provider that matches a provided predicate. |
| AccessProvider.toString() | Get "AccessProvider" as a String. |
| AccessProvider.where() | Get a Set of access providers that match a provided predicate.
| ## [](#instance-methods)Instance methods You can use the following instance methods to manage specific `AccessProvider` documents in FQL. | Method | Description | | --- | --- | --- | --- | | accessProvider.delete() | Delete an access provider. | | accessProvider.exists() | Test if an access provider exists. | | accessProvider.replace() | Replace an access provider. | | accessProvider.update() | Update an access provider. | # `AccessProvider.all()` | Learn: Access providers | | --- | --- | --- | Get a Set of all [access providers](../../../../learn/security/access-providers/). ## [](#signature)Signature ```fql-sig AccessProvider.all() => Set AccessProvider.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [access providers](../../../../learn/security/access-providers/), represented as [`AccessProvider` documents](../), for the database. To limit the returned Set, you can provide an optional range. `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). If this method is the last expression in a query, the first page of the Set is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of AccessProvider documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded AccessProvider.all() call. See Range examples.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all access providers are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be an AccessProvider document. | | to | Any | | End of the range (inclusive). Must be an AccessProvider document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of AccessProvider documents in the provided range. If a range is omitted, all access providers are returned.The Set is empty if:The database has no access providers.There are no access providers in the provided range.The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples 1. Get all access providers for the database: ```fql AccessProvider.all() ``` ``` { data: [ { name: "issuerFoo", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/a", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/a/.well-known/jwks.json" }, { name: "issuerBaz", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/b", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/b/.well-known/jwks.json" }, { name: "issuerBar", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... 
issuer: "https://example.com/c", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/c/.well-known/jwks.json" }, { name: "issuerQux", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/d", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/d/.well-known/jwks.json" }, ... ] } ``` 2. Given the previous Set, get all access providers starting with `issuerBaz` (inclusive): ```fql AccessProvider.all({ from: AccessProvider.byName("issuerBaz") }) ``` ``` { data: [ { name: "issuerBaz", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/b", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/b/.well-known/jwks.json" }, { name: "issuerBar", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/c", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/c/.well-known/jwks.json" }, { name: "issuerQux", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/d", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/d/.well-known/jwks.json" }, ... ] } ``` 3. Get a Set of access providers from `issuerBaz` (inclusive) to `issuerBar` (inclusive): ```fql AccessProvider.all({ from: AccessProvider.byName("issuerBaz"), to: AccessProvider.byName("issuerBar") }) ``` ``` { data: [ { name: "issuerBaz", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/b", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/b/.well-known/jwks.json" }, { name: "issuerBar", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/bc", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/c/.well-known/jwks.json" } ] } ``` 4. Get a Set of access providers up to `issuerBar` (inclusive): ```fql AccessProvider.all({ to: AccessProvider.byName("issuerBar") }) ``` ``` { data: [ { name: "issuerFoo", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/a", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/a/.well-known/jwks.json" }, { name: "issuerBaz", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/b", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/b/.well-known/jwks.json" }, { name: "issuerBar", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), ... issuer: "https://example.com/c", audience: "https://db.fauna.com/db/ysjowue14yyr1", jwks_uri: "https://example.com/c/.well-known/jwks.json" } ] } ``` # `AccessProvider.byName()` | Learn: Access providers | | --- | --- | --- | Get an [access provider](../../../../learn/security/access-providers/) by its name. ## [](#signature)Signature ```fql-sig AccessProvider.byName(name: String) => NamedRef ``` ## [](#description)Description Gets an [access provider](../../../../learn/security/access-providers/), represented as an [`AccessProvider` document](../), by its name. `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). 
### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | name | String | true | name of the AccessProvider document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NamedRef | Resolved reference to the AccessProvider document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql AccessProvider.byName("someIssuer") ``` ``` { name: "someIssuer", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), roles: [ "customer", { role: "manager", predicate: "(jwt) => jwt!.scope.includes(\"manager\")" } ], issuer: "https://example.com/", jwks_uri: "https://example.com/.well-known/jwks.json", audience: "https://db.fauna.com/db/ysjowue14yyr1" } ``` # `AccessProvider.create()` | Learn: Access providers | | --- | --- | --- | Create an [access provider](../../../../learn/security/access-providers/). ## [](#signature)Signature ```fql-sig AccessProvider.create(data: { name: String, issuer: String, jwks_uri: String, roles: String | { role: String, predicate: String } | Array | Null, data: { *: Any } | Null }) => AccessProvider ``` ## [](#description)Description Creates an [access provider](../../../../learn/security/access-providers/) with the provided document fields. Fauna stores access providers as documents in the [`AccessProvider` system collection](../). `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method adds an access provider to the staged schema, not the active schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the new AccessProvider document.For supported document fields, see AccessProvider collection. 
| ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | AccessProvider | The new AccessProvider document. | ## [](#examples)Examples ```fql AccessProvider.create({ name: "someIssuer", issuer: "https://example.com/", jwks_uri: "https://example.com/.well-known/jwks.json", roles: [ "customer", { role: "manager", predicate: "(jwt) => jwt!.scope.includes(\"manager\")" } ], data: { desc: "Access provider for issuer" } }) ``` ``` { name: "someIssuer", coll: AccessProvider, ts: Time("2099-06-25T13:08:04.020Z"), issuer: "https://example.com/", audience: "https://db.fauna.com/db/ysjons5xryyr4", data: { desc: "Access provider for issuer" }, jwks_uri: "https://example.com/.well-known/jwks.json", roles: [ "customer", { role: "manager", predicate: "(jwt) => jwt!.scope.includes(\"manager\")" } ] } ``` # `AccessProvider.firstWhere()` | Learn: Access providers | | --- | --- | --- | Get the first [access provider](../../../../learn/security/access-providers/) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig AccessProvider.firstWhere(pred: (AccessProvider => Boolean)) => AccessProvider | Null ``` ## [](#description)Description Gets the first [access provider](../../../../learn/security/access-providers/), represented as an [`AccessProvider` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts an AccessProvider document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first AccessProvider document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | AccessProvider | First AccessProvider document that matches the predicate. | | Null | No AccessProvider document matches the predicate. | ## [](#examples)Examples ```fql AccessProvider.firstWhere(.issuer == "https://example.com/") ``` ``` { name: "someIssuer", coll: AccessProvider, ts: Time("2099-06-25T14:57:23.125Z"), jwks_uri: "https://example.com/.well-known/jwks.json", roles: [ "customer", { role: "manager", predicate: "(jwt) => jwt!.scope.includes(\"manager\")" } ], audience: "https://db.fauna.com/db/ysjowue14yyr1", issuer: "https://example.com/" } ``` # `AccessProvider.toString()` | Learn: Access providers | | --- | --- | --- | Get `"AccessProvider"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig AccessProvider.toString() => String ``` ## [](#description)Description Returns the name of the [`AccessProvider` collection](../) as a [String](../../../fql/types/#string). 
## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "AccessProvider" | ## [](#examples)Examples ```fql AccessProvider.toString() ``` ``` "AccessProvider" ``` # `AccessProvider.where()` | Learn: Access providers | | --- | --- | --- | Get a Set of [access providers](../../../../learn/security/access-providers/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig AccessProvider.where(pred: (AccessProvider => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [access providers](../../../../learn/security/access-providers/), represented as [`AccessProvider` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). If `AccessProvider.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts an AccessProvider document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of AccessProvider documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of AccessProvider documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql AccessProvider.where(.issuer.includes("example.com")) ``` ``` { data: [ { name: "someIssuer", coll: AccessProvider, ts: Time("2099-06-25T13:15:14.165Z"), jwks_uri: "https://example.com/.well-known/jwks.json", issuer: "https://example.com/", audience: "https://db.fauna.com/db/ysjons5xryyr4", roles: [ "customer", { role: "manager", predicate: "(jwt) => jwt!.scope.includes(\"manager\")" } ], }, ... ] } ``` # `accessProvider.delete()` | Learn: Access providers | | --- | --- | --- | Delete an [access provider](../../../../learn/security/access-providers/). ## [](#signature)Signature ```fql-sig delete() => NullAccessProvider ``` ## [](#description)Description Deletes an [access provider](../../../../learn/security/access-providers/), represented as a [`AccessProvider` document](../). `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). ### [](#staged-schema)Staged schema You can’t delete an access provider while a database has [staged schema](../../../../learn/schema/manage-schema/#staged). If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. 
#### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullAccessProvider | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ```fql AccessProvider.byName("someIssuer")?.delete() ``` ``` AccessProvider.byName("someIssuer") /* deleted */ ``` # `accessProvider.exists()` | Learn: Access providers | | --- | --- | --- | Test if an [access provider](../../../../learn/security/access-providers/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if an [access provider](../../../../learn/security/access-providers/), represented as an [`AccessProvider` document](../), exists. `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql AccessProvider.byName("someIssuer").exists() // true AccessProvider.byName("someIssuer") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the AccessProvider document exists. If false, the AccessProvider document doesn’t exist. | ## [](#examples)Examples ```fql AccessProvider.byName("someIssuer").exists() ``` ``` true ``` # `accessProvider.replace()` | Learn: Access providers | | --- | --- | --- | Replace an [access provider](../../../../learn/security/access-providers/). ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => AccessProvider ``` ## [](#description)Description Replaces all fields in an [access provider](../../../../learn/security/access-providers/), represented as an [`AccessProvider` document](../), with fields from a provided data object. 
Fields not present in the data object, excluding the `coll` and `ts` metadata fields, are removed. `AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). ### [](#metadata-fields)Metadata fields You can’t use this method to replace the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename an access provider while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Fields for the AccessProvider document. Fields not present, excluding the coll and ts metadata fields, in the object are removed.For supported document fields, see AccessProvider collection.The object can’t include the following metadata fields:collts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | AccessProvider | AccessProvider document with replaced fields. | ## [](#examples)Examples ```fql AccessProvider.byName("someIssuer")?.replace({ name: "someIssuer", issuer: "https://example.com/", roles: "customer", jwks_uri: "https://example.com/.well-known/jwks.json" }) ``` ``` { name: "someIssuer", coll: AccessProvider, ts: Time("2099-06-25T15:00:07.450Z"), audience: "https://db.fauna.com/db/ysjowue14yyr1", issuer: "https://example.com/", roles: "customer", jwks_uri: "https://example.com/.well-known/jwks.json" } ``` # `accessProvider.update()` | Learn: Access providers | | --- | --- | --- | Update an [access provider](../../../../learn/security/access-providers/). ## [](#signature)Signature ```fql-sig update(data: { *: Any }) => AccessProvider ``` ## [](#description)Description Updates an [access provider](../../../../learn/security/access-providers/), represented as an [`AccessProvider` document](../), with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. 
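For instance, a minimal sketch of the merge behavior, using the `someIssuer` provider from the examples below:

```fql
// update() copies only the provided fields into the document.
// Fields omitted from the data object, such as roles and jwks_uri,
// are left unchanged. By contrast, replace() removes any field
// (other than coll and ts) that isn't in the data object.
AccessProvider.byName("someIssuer")?.update({
  data: { desc: "Access provider for issuer" }
})
```

To remove a field with `update()`, set its value to `null` in the data object, as described under Remove a field.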
`AccessProvider` documents are FQL versions of a database’s FSL [access provider schema](../../../fsl/access-provider/). `AccessProvider` documents have the [AccessProvider](../../../fql/types/#accessprovider) type. See [Access providers](../../../../learn/security/access-providers/). ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename an access provider while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the AccessProvider document.For supported document fields, see AccessProvider collection.The object can’t include the following metadata fields:* coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | AccessProvider | The updated AccessProvider document. | ## [](#examples)Examples ```fql AccessProvider.byName("someIssuer")?.update({ name: "someIssuer", issuer: "https://example.com/", roles: "customer" }) ``` ``` { name: "someIssuer", coll: AccessProvider, ts: Time("2099-06-25T15:00:27.295Z"), jwks_uri: "https://example.com/.well-known/jwks.json", roles: "customer", issuer: "https://example.com/", audience: "https://db.fauna.com/db/ysjowue14yyr1" } ``` # Array [Array](../../fql/types/#array) methods and properties. ## [](#description)Description [Array](../../fql/types/#array) methods are provided for working with multiple items grouped under a single identifier. Arrays are immutable. ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Array.sequence() | Create an ordered Array of integers, given start and end values. | ## [](#instance-properties)Instance properties | Property | Description | | --- | --- | --- | --- | | array.length | Get the length of an Array. 
## [](#instance-methods)Instance methods

| Method | Description |
| --- | --- |
| array.aggregate() | Aggregate all elements of an Array. |
| array.any() | Test if any element of an Array matches a provided predicate. |
| array.append() | Append a provided element to an Array. |
| array.at() | Get the Array element at a provided index. |
| array.concat() | Concatenate two Arrays. |
| array.distinct() | Get the unique elements of an Array. |
| array.drop() | Drop the first N elements of an Array. |
| array.entries() | Add the index to each element of an Array. |
| array.every() | Test if every element of an Array matches a provided predicate. |
| array.filter() | Filter an Array using a provided predicate. |
| array.first() | Get the first element of an Array. |
| array.firstWhere() | Get the first element of an Array that matches a provided predicate. |
| array.flatMap() | Apply a provided function to each Array element and flatten the resulting Array by one level. |
| array.flatten() | Flatten an Array by one level. |
| array.fold() | Reduce the Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. |
| array.foldRight() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. |
| array.forEach() | Run a provided function on each element of an Array. Can perform writes. |
| array.includes() | Test if the Array includes a provided element. |
| array.indexOf() | Get the index of the first Array element that matches a provided value. |
| array.indexWhere() | Get the index of the first Array element that matches a provided predicate. |
| array.isEmpty() | Test if an Array is empty. |
| array.last() | Get the last element of an Array. |
| array.lastIndexOf() | Get the index of the last Array element that matches a provided value. |
| array.lastIndexWhere() | Get the index of the last Array element that matches a provided predicate. |
| array.lastWhere() | Get the last element of an Array that matches a provided predicate. |
| array.map() | Apply a provided function to each element of an Array. Can’t perform writes. |
| array.nonEmpty() | Test if an Array is not empty. |
| array.order() | Sort an Array's elements. |
| array.prepend() | Prepend an element to an Array. |
| array.reduce() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses the first element as the initial value. |
| array.reduceRight() | Reduce an Array to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses the first element as the initial value. |
| array.reverse() | Reverse the order of an Array's elements. |
| array.slice() | Get a subset of an Array's elements based on provided indexes. |
| array.take() | Get the first N elements of an Array. |
| array.toSet() | Convert an Array to a Set. |
| array.toString() | Convert an Array to a String. |
| array.where() | Get the elements of an Array that match a provided predicate. |

# `Array.sequence()`

Create an ordered [Array](../../../fql/types/#array) of [Number](../../../fql/types/#number)s given start and end values.
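As a quick illustration of how `Array.sequence()` composes with the Array instance methods listed above, a sketch that squares and sums a generated sequence:

```fql
// Builds [1, 2, 3, 4]: the end value (5) is exclusive.
// map() transforms each element; fold() accumulates from a seed of 0.
Array.sequence(1, 5)
  .map(x => x * x)
  .fold(0, (sum, x) => sum + x)
```

The query evaluates to `30` (1 + 4 + 9 + 16).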
## [](#signature)Signature ```fql-sig Array.sequence(from: Number, until: Number) => Array ``` ## [](#description)Description Create an ordered [Array](../../../fql/types/#array) of integers beginning with provided start and end integers. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Number | true | Start of the sequence (inclusive). Must be an Int. | | until | Number | true | End of the sequence (exclusive). Must be an Int. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array of ordered integers. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql Array.sequence(1, 5) ``` ``` [ 1, 2, 3, 4 ] ``` ### [](#equal-start-and-end-values)Equal start and end values If the start and end values are equal, `Array.sequence()` produces an empty Array (`[]`). ```fql Array.sequence(1, 1) ``` ``` [] ``` # `array.length` The number of elements in the [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig length: Number ``` ## [](#description)Description The number of elements in an [Array](../../../fql/types/#array). The length of an empty Array is `0`. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Number of elements in the Array. | ## [](#examples)Examples ```fql [0, 1, 2, 3].length ``` ``` 4 ``` # `array.aggregate()` Aggregate all elements of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig aggregate(seed: A, combiner: (A, A) => A) => A ``` ## [](#description)Description Aggregates all elements of the calling [Array](../../../fql/types/#array). There is no ordering expectation. The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | seed | Generic | true | Initial state value. | | combiner | Function | true | Anonymous FQL function that aggregates the elements. | ### [](#combiner-function-parameters)Combiner function parameters: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | | Value returned by the previous invocation. | | current | Generic | | The current element’s value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Aggregate of the iterable. If the iterable is empty, the seed value is returned. | ## [](#examples)Examples ```fql let iter = [1, 2, 3, 4] iter.aggregate(0, (a, b) => a + b) ``` ``` 10 ``` # `array.any()` Test if any element of an [Array](../../../fql/types/#array) matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig any(predicate: (A => Boolean | Null)) => Boolean ``` ## [](#description)Description Tests if any element of the calling [Array](../../../fql/types/#array) matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts an Array element as its only argument. You can pass in this argument using arrow function syntax.Returns Boolean or Null.The method returns true if the predicate is true for any element in the Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the predicate is true for one or more elements in the Array. 
Otherwise, false. | ## [](#examples)Examples ```fql let iter = [1, 2, 3] iter.any(v => v == 2) ``` ``` true ``` # `array.append()` Append a provided element to an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig append(element: A) => Array ``` ## [](#description)Description Appends a provided element to the calling [Array](../../../fql/types/#array). The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Element to append to the Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array with the appended element. | ## [](#examples)Examples ```fql [1, 2].append(3) ``` ``` [ 1, 2, 3 ] ``` # `array.at()` Get the [Array](../../../fql/types/#array) element at a provided index. ## [](#signature)Signature ```fql-sig at(index: Number) => A ``` ## [](#description)Description Gets the [Array](../../../fql/types/#array) element located at a provided index. The method is equivalent to using `[]` indexing, such that `array.at(0)` returns the same element as `array[0]`. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | index | Number | true | Location of the Array element to return. Must be an Int.Arrays have zero-based indexes. Valid index values are zero to one less than the Array length. Invalid index values result in an error. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Array element located at the provided index. | ## [](#examples)Examples Get the element at index location four: ```fql [1, 2, 3, 4, 5].at(4) ``` ``` 5 ``` # `array.concat()` Concatenate two [Array](../../../fql/types/#array)s. ## [](#signature)Signature ```fql-sig concat(other: Array) => Array ``` ## [](#description)Description Creates an [Array](../../../fql/types/#array) by copying the calling array to a new Array and appending another Array. The calling Array and the other Array aren’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | other | Array | true | Array to append to the calling Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array composed of the concatenated Arrays. | ## [](#examples)Examples This example concatenates the second [Array](../../../fql/types/#array) to the first: ```fql [0, 1, 2, 3, 4].concat([3, 2, 1, 0]) ``` ``` [ 0, 1, 2, 3, 4, 3, 2, 1, 0 ] ``` # `array.distinct()` Get the unique elements of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig distinct() => Array ``` ## [](#description)Description Gets the unique elements of the calling [Array](../../../fql/types/#array). The calling Array isn’t changed. ### [](#avoid-using-distinct-with-large-sets)Avoid using `distinct()` with large Sets Avoid using `distinct()` to process Arrays converted from large or unbounded Sets that contain 16,000 or more documents. If a Set contains 16,000 or more documents, the query requires pagination. [`array.distinct()`](./) would only be able to extract unique elements from each page of results. Instead, retrieve all field values and process them on the client side. See [Get unique field values](../../../../learn/query/patterns/get-unique-values/). 
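For small, bounded Sets, one approach is to convert the Set to an Array and call `distinct()` on a projected field. The following sketch assumes the demo `Product` collection used elsewhere in this reference:

```fql
// Convert a small `Product` Set to an Array, project each
// product's category name, and keep only the unique values.
Product.all().toArray().map(product => product.category!.name).distinct()
```

The result is an Array of unique category names.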
## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Unique elements in the Array. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql [1, 1, 2, 3, 3].distinct() ``` ``` [ 1, 2, 3 ] ``` ### [](#enforce-unique-values-in-an-array-field)Enforce unique values in an Array field You can use [`array.distinct()`](./) in a [check constraint](../../../fsl/check/) to enforce distinct values within a document’s [Array](../../../fql/types/#array) field. For example: ```fsl collection Customer { ... emails: Array // Check constraint. `Customer` document writes are // only allowed if the `emails` Array field contains no duplicate values. check uniqueEmails (.emails.length == .emails.distinct().length) ... } ``` The check constraint ensures the following document write is rejected: ```fql // Rejected due to check constraint. Customer.create({ emails: [ "john.doe@example.com", "john.doe@example.com", ] }) ``` # `array.drop()` Drop the first _N_ elements of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig drop(amount: Number) => Array ``` ## [](#description)Description Drops a provided number of elements from an [Array](../../../fql/types/#array), beginning at element\[0\]. This lets you "skip" the elements. The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | amount | Number | true | Number of elements to drop. If this value is greater than the Array length, an empty Array is returned. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | An Array with the elements dropped. | ## [](#examples)Examples ```fql [1, 2, 3, 4, 5].drop(2) ``` ``` [ 3, 4, 5 ] ``` # `array.entries()` Add the index to each element of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig entries() => Array<[Number, A]> ``` ## [](#description)Description Adds the index to every Array element, creating an Array of each `[index, element]` pair. The calling Array isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array<[Number, Generic]> | List of Arrays of index:element pairs for each element of the calling Array. | ## [](#examples)Examples ```fql ['a', 'b', 'c'].entries() ``` ``` [ [ 0, "a" ], [ 1, "b" ], [ 2, "c" ] ] ``` # `array.every()` Test if every element of an [Array](../../../fql/types/#array) matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig every(predicate: (A => Boolean | Null)) => Boolean ``` ## [](#description)Description Tests if every element of the calling Array matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts an Array element as its only argument. You can pass in this argument using arrow function syntax.Returns Boolean or Null.The method returns true if the predicate is true for every element in the Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the predicate evaluates to true for every element of the Array. Otherwise, false.
| ## [](#examples)Examples ```fql let iter = [1, -2, 3] iter.every(v => v > 0) ``` ``` false ``` # `array.filter()` Filter an [Array](../../../fql/types/#array) using a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig filter(predicate: (A => Boolean | Null)) => Array ``` ## [](#description)Description Returns an Array containing elements of the calling Array that match a provided [predicate function](../../../fql/functions/#predicates). The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts an Array element as its only argument. You can pass in this argument using arrow function syntax.Returns Boolean or Null.If the predicate evaluates to true for an element, the element is included in the Array returned by the method. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array containing elements that evaluate to true for the provided predicate.If no elements evaluate to true, the method returns an empty Array. | ## [](#examples)Examples ```fql [1, 2, 3, 4, 5].filter((x) => {x == 1 || x == 4}) ``` ``` [ 1, 4 ] ``` ## [](#see-also)See also [`array.where()`](../where/) # `array.first()` Get the first element of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig first() => A | Null ``` ## [](#description)Description Gets the first element of the calling [Array](../../../fql/types/#array). ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | First element of the Array. | | Null | Returns Null for an empty Array. | ## [](#examples)Examples ```fql [1, 2].first() ``` ``` 1 ``` # `array.firstWhere()` Get the first element of an [Array](../../../fql/types/#array) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig firstWhere(predicate: (A => Boolean | Null)) => A | Null ``` ## [](#description)Description Gets the first element of the calling [Array](../../../fql/types/#array) that matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts an Array element as its only argument. You can pass in this argument using arrow function syntax. Supports shorthand-syntax for objects and documents.Returns Boolean or Null.The method returns the first Array element for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | First element of the Array that matches the predicate. | | Null | Returned if no Array element matches the predicate or the Array is empty. | ## [](#examples)Examples ```fql let iter = [1, 2, 3, 4] iter.firstWhere(v => v > 2) ``` ``` 3 ``` # `array.flatMap()` Apply a provided [function](../../../fql/functions/) to each [Array](../../../fql/types/#array) element and flatten the resulting Array by one level. 
## [](#signature)Signature ```fql-sig flatMap(mapper: (A => Array)) => Array ``` ## [](#description)Description Creates an [Array](../../../fql/types/#array) by invoking a provided mapper [function](../../../fql/functions/) on each element of the calling Array and flattening the resulting Array by one level. The Array elements are passed as a parameter to the mapper function sequentially. The calling Array isn’t changed. ### [](#array-iteration-methods)Array iteration methods FQL provides several methods for iterating over an Array. [`array.forEach()`](../foreach/), [`array.map()`](../map/), and [`array.flatMap()`](./) are similar but return different values. | Method | Return value | | --- | --- | --- | --- | | array.forEach() | Null. | | array.map() | Array. | | array.flatMap() | Array, flattened by one level. | For examples, see: * [`array.forEach()` vs. `array.map()`](../map/#foreach-vs-map) * [`array.map()` vs. `array.flatMap()`](#map-vs-flatmap) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | mapper | Function | true | Function to invoke on each element of the calling Array. Each element is passed to the mapper function as an argument. The function must return an Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array containing the result of invoking the mapper function on each element of the calling Array. The resulting Array is flattened by one level. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql [1, 2, 3].flatMap((val) => [val, val * 2, val * 3]) ``` ``` [ 1, 2, 3, 2, 4, 6, 3, 6, 9 ] ``` ### [](#map-vs-flatmap)`array.map()` vs. `array.flatMap()` [`array.flatMap()`](./) is similar to [`array.map()`](../map/), except [`array.flatMap()`](./) also flattens the resulting Array by one level. In the following example, [`array.map()`](../map/) returns a two-dimensional Array: ```fql // Create an Array of product names. let products = [ "limes", "avocados" ] // Use `map()` to get a Set of matching `Product` collection // documents for each name. Convert each Set to an Array. products.map(product => { Product.byName(product).toArray() }) ``` ``` // Two-dimensional Array. [ [ { id: "777", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("789") } ], [ { id: "444", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ] ] ``` To flatten the result to a one-dimensional array, use [`array.flatMap()`](./) instead: ```fql // Create an Array of product names. let products = [ "limes", "avocados" ] // Use `flatMap()` to get a Set of matching `Product` collection // documents for each name. Convert each Set to an Array. // Then flatten the resulting Array by one level. products.flatMap(product => { Product.byName(product).toArray() }) ``` ``` // One-dimensional Array. [ { id: "777", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("789") }, { id: "444", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ] ``` # `array.flatten()` Flatten an [Array](../../../fql/types/#array) by one level. 
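Flattening removes only one level of nesting; Arrays nested more than one level deep are preserved. For example:

```fql
// Only the outermost level is flattened.
// The inner `[2]` remains an Array.
[[1, [2]], [3]].flatten()
```

```
[ 1, [ 2 ], 3 ]
```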
## [](#signature)Signature ```fql-sig flatten() => Array ``` ## [](#description)Description Flattens an [Array](../../../fql/types/#array) by one level, reducing its dimensions by one. The calling Array must be a multi-dimensional Array. If there are non-Array elements in the Array, an error is returned. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Flattened Array. | ## [](#examples)Examples ```fql [[1, 2], [3, 4]].flatten() ``` ``` [ 1, 2, 3, 4 ] ``` # `array.fold()` Reduce the [Array](../../../fql/types/#array) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. ## [](#signature)Signature ```fql-sig fold(seed: B, reducer: (B, A) => B) => B ``` ## [](#description)Description Iterates through each element in an [Array](../../../fql/types/#array) to perform a rolling operation. For example, you can use `fold()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `fold()` calls a reducer callback function on every element of the Array from left to right. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. For the first iteration, a seed value serves as the initial accumulator. * The current element’s value from the Array. The method returns the result of the last iteration. The calling Array isn’t changed. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce an Array to a single value. These methods include: * [`array.fold()`](./) * [`array.foldRight()`](../foldright/) * [`array.reduce()`](../reduce/) * [`array.reduceRight()`](../reduceright/) The methods are similar but have the following differences: * [`array.fold()`](./) and [`array.foldRight()`](../foldright/) accept an initial _seed_ value and use it as the initial _accumulator_. [`array.reduce()`](../reduce/) and [`array.reduceRight()`](../reduceright/) use the array’s first element as the initial _accumulator_. * [`array.fold()`](./) and [`array.reduce()`](../reduce/) iterate through the Array’s elements from left to right. [`array.foldRight()`](../foldright/) and [`array.reduceRight()`](../reduceright/) iterate through the Array’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | seed | Generic | true | Initial accumulator value provided to the reducer function. | | reducer | Function | true | Anonymous FQL function to call on each element of the Array. | ### [](#reducer-function-arguments)Reducer function arguments: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | true | Value returned by the previous reducer function call. On the first call, seed is passed as the accumulator. | | current | Generic | true | The current element’s value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. For an empty Array, the seed is returned.
| ## [](#examples)Examples ```fql let iter = [1, 2, 3] iter.fold(100, (value, elem) => value + elem) ``` ``` 106 ``` ### [](#group-by-operation)Group by operation [FQL](../../../../learn/query/) doesn’t provide a built-in `GROUP BY` operation. However, you can use `fold()` in an [anonymous FQL function](../../../fql/functions/) or a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) to achieve the same result. As an FQL function: ```fql // Defines an anonymous `groupBy()` function. // `groupBy()` takes two arguments: // * `set`: Set or Array containing data to group // * `key_fn`: Grouping key for each element let groupBy = (set, key_fn) => { // Calls the `fold()` function on the `set` // Set or Array. set.fold( // Start with an empty object. {}, (acc, val) => { // For each value, get a key using the `key_fn` arg. let key = key_fn(val) let existing_group = acc[key] ?? [] // Append the current value to the Set or // Array for that key. let new_group = existing_group.append(val) let new_entry = Object.fromEntries([ [key, new_group] ]) // Return an object with grouped results. Object.assign(acc, new_entry) } ) } // Call the `groupBy()` function. // Groups `Product` documents by category name. groupBy(Product.all(), .category!.name) ``` You can also define a `groupBy()` UDF. This lets you reuse the function across multiple queries. You create and manage a UDF as an FSL [function schema](../../../fsl/function/): ```fsl // Defines the `groupBy()` UDF. // `groupBy()` takes two arguments: // * `set`: Set or Array containing data to group // * `key_fn`: Grouping key for each element function groupBy (set, key_fn) { // Calls the `fold()` function on the `set` // Set or Array. set.fold( // Start with an empty object. {}, (acc, val) => { // For each value, get a key using the `key_fn` arg. let key: String = key_fn(val) let existing_group = acc[key] ?? [] // Append the current value to the Set or // Array for that key. let new_group = existing_group.append(val) let new_entry = Object.fromEntries([ [key, new_group] ]) // Return an object with grouped results. Object.assign(acc, new_entry) } ) } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) For additional examples using the `groupBy()` UDF, see [Group By: Aggregate data in Fauna](../../../../learn/query/patterns/group-by/). # `array.foldRight()` Reduce an [Array](../../../fql/types/#array) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. ## [](#signature)Signature ```fql-sig foldRight(seed: B, reducer: (B, A) => B) => B ``` ## [](#description)Description Iterates through each element in an [Array](../../../fql/types/#array) to perform a rolling operation. For example, you can use `foldRight()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `foldRight()` calls a reducer callback function on every element of the Array from right to left. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. For the first iteration, a seed value serves as the initial accumulator.
* The current element’s value from the Array. The method returns the result of the last iteration. The calling Array isn’t changed. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce an Array to a single value. These methods include: * [`array.fold()`](../fold/) * [`array.foldRight()`](./) * [`array.reduce()`](../reduce/) * [`array.reduceRight()`](../reduceright/) The methods are similar but have the following differences: * [`array.fold()`](../fold/) and [`array.foldRight()`](./) accept an initial _seed_ value and use it as the initial _accumulator_. [`array.reduce()`](../reduce/) and [`array.reduceRight()`](../reduceright/) use the array’s first element as the initial _accumulator_. * [`array.fold()`](../fold/) and [`array.reduce()`](../reduce/) iterate through the Array’s elements from left to right. [`array.foldRight()`](./) and [`array.reduceRight()`](../reduceright/) iterate through the Array’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | seed | Generic | true | Initial accumulator value provided to the reducer function. | | reducer | Function | true | Anonymous FQL function to call on each element of the Array. | ### [](#reducer-function-arguments)Reducer function arguments: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | true | Value returned by the previous reducer function call. On the first call, seed is passed as the accumulator. | | current | Generic | true | The current element’s value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. For an empty Array, the seed is returned. | ## [](#examples)Examples ```fql let iter = [1, 2, 3] iter.foldRight(100, (value, elem) => value + elem) ``` ``` 106 ``` # `array.forEach()` Run a provided [function](../../../fql/functions/) on each element of an [Array](../../../fql/types/#array). Can perform writes. ## [](#signature)Signature ```fql-sig forEach(callback: (A => Any)) => Null ``` ## [](#description)Description Iterates over all elements in the [Array](../../../fql/types/#array) and executes a provided callback function on each element. It is used for mutations or writing documents based on each Array element. The calling Array isn’t changed. ### [](#array-iteration-methods)Array iteration methods FQL provides several methods for iterating over an Array. [`array.forEach()`](./), [`array.map()`](../map/), and [`array.flatMap()`](../flatmap/) are similar but return different values. | Method | Return value | | --- | --- | --- | --- | | array.forEach() | Null. | | array.map() | Array. | | array.flatMap() | Array, flattened by one level. | For examples, see: * [`array.forEach()` vs. `array.map()`](#foreach-vs-map) * [`array.map()` vs. `array.flatMap()`](../map/#map-vs-flatmap) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | callback | Function | true | Anonymous FQL function to call on each element of the Array. Each call returns Null. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Null | This method always returns Null. | ## [](#examples)Examples ### [](#basic)Basic ```fql // Create an Array of objects that contain document data. 
let customers = [ { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } }, { "name": "Scott Chegg", "email": "chegg@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } }, { "name": "Hilary Ouse", "email": "ouse@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } ] // Use `forEach()` to create a `Customer` collection document for each // element of the previous Array. customers.forEach(doc => Customer.create({ doc })) // `forEach()` returns `null`. ``` ``` null ``` Although it returns `null`, [`array.forEach()`](./) still performed the requested operations. To verify: ```fql // Get all `Customer` collection documents Customer.all() ``` ``` { // The results contain `Customer` documents created by // the previous `forEach()` call. data: [ { id: "410652919999758409", coll: Customer, ts: Time("2099-10-02T16:53:12.770Z"), cart: null, orders: "hdW...", name: "Ruby Von Rails", email: "ruby@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, { id: "410652919999759433", coll: Customer, ts: Time("2099-10-02T16:53:12.770Z"), cart: null, orders: "hdW...", name: "Scott Chegg", email: "chegg@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, { id: "410652919999760457", coll: Customer, ts: Time("2099-10-02T16:53:12.770Z"), cart: null, orders: "hdW...", name: "Hilary Ouse", email: "ouse@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ] } ``` ### [](#foreach-vs-map)`array.forEach()` vs. `array.map()` You can use both [`array.forEach()`](./) and [`array.map()`](../map/) to iterate through an Array. However, each method returns a different value type. [`array.forEach()`](./) returns `null`: ```fql // Create an Array of objects that contain category data. let categories = [ { "name": "Electronics", "description": "Bargain electronics!" }, { "name": "Books", "description": "Bargain books!" } ] // Use `forEach()` to create a `Category` collection document for each // element of the previous Array. categories.forEach(doc => Category.create({ doc })) ``` ``` null ``` Although it returns `null`, [`array.forEach()`](./) still performed the requested operations. To verify: ```fql // Get all `Category` collection documents Category.all() ``` ``` // The results contain `Customer` documents created by // the previous `forEach()` call. { data: [ ... { id: "410665732340187209", coll: Category, ts: Time("2099-10-02T20:16:51.560Z"), products: "hdW...", name: "Electronics", description: "Bargain electronics!" }, { id: "410665732340188233", coll: Category, ts: Time("2099-10-02T20:16:51.560Z"), products: "hdW...", name: "Books", description: "Bargain books!" } ] } ``` [`array.map()`](../map/) returns an [Array](../../../fql/types/#array): ```fql // Create an Array of objects that contain category data. let categories = [ { "name": "Movies", "description": "Bargain movies!" }, { "name": "Music", "description": "Bargain music!" } ] // Use `map()` to create a `Category` collection document for each // element of the previous Array. 
categories.map(doc => Category.create({ doc })) ``` In this case, the Array contains documents created by the [`array.map()`](../map/) call: ``` [ { id: "410655308219678797", coll: Category, ts: Time("2099-10-02T17:31:10.366Z"), products: "hdW...", name: "Movies", description: "Bargain movies!" }, { id: "410655308219679821", coll: Category, ts: Time("2099-10-02T17:31:10.366Z"), products: "hdW...", name: "Music", description: "Bargain music!" } ] ``` # `array.includes()` Test if the [Array](../../../fql/types/#array) includes a provided element. ## [](#signature)Signature ```fql-sig includes(element: A) => Boolean ``` ## [](#description)Description Tests if the [Array](../../../fql/types/#array) includes a provided element. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Element to check the Array for. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Array contains the provided element. Otherwise, false. | ## [](#examples)Examples ```fql let iter = [1, 2, 3] iter.includes(2) ``` ``` true ``` # `array.indexOf()` Get the index of the first [Array](../../../fql/types/#array) element that matches a provided value. ## [](#signature)Signature ```fql-sig indexOf(element: A) => Number | Null indexOf(element: A, start: Number) => Number | Null ``` ## [](#description)Description Searches, left-to-right, for the first element that matches a provided value and returns the index of the element if a match is found. If an optional start index is provided, the method searches left-to-right starting at the start index and returns the first matching index (inclusive). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Value to search for in the Array elements. | | start | Number | | Starting index (inclusive) of the left-to-right search. Must be an Int. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Index of the element that matches the provided value. | | Null | Returned if a match isn’t found. | ## [](#examples)Examples ```fql ['a', 'b', 'c', 'b'].indexOf('c') ``` ``` 2 ``` # `array.indexWhere()` Get the index of the first [Array](../../../fql/types/#array) element that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig indexWhere(predicate: (A => Boolean | Null)) => Number | Null indexWhere(predicate: A => Boolean | Null, start: Number) => Number | Null ``` ## [](#description)Description Searches, left-to-right, for the first element that matches a provided [predicate function](../../../fql/functions/#predicates) and returns the index of the element if a match is found. If the optional start index is provided, the method searches left-to-right starting at index and returns the first matching index (inclusive). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Function | true | Anonymous predicate function that:Accepts an Array element as its only argument. You can pass in this argument using arrow function syntax.Returns Boolean or Null.If the predicate evaluates to true for an element, the element is considered a match. | | start | Number | | Starting index (inclusive) of the left-to-right search. Must be an Int. 
| ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Index of the element that matches the provided predicate. | | Null | Returned if a match isn’t found. | ## [](#examples)Examples ```fql ['a', 'b', 'c', 'b'].indexWhere(v => v == 'c') ``` ``` 2 ``` # `array.isEmpty()` Test if an [Array](../../../fql/types/#array) is empty. ## [](#signature)Signature ```fql-sig isEmpty() => Boolean ``` ## [](#description)Description Tests if the calling [Array](../../../fql/types/#array) is empty and contains no elements. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Array is empty. Otherwise, false. | ## [](#examples)Examples ```fql let iter = [] iter.isEmpty() ``` ``` true ``` # `array.last()` Get the last element of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig last() => A | Null ``` ## [](#description)Description Returns the last element of the [Array](../../../fql/types/#array). The method reverses the Array and gets the first item. ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Last element of the Array. | | Null | Returned if the Array is empty. | ## [](#examples)Examples ```fql let iter = [1] iter.last() ``` ``` 1 ``` # `array.lastIndexOf()` Get the index of the last [Array](../../../fql/types/#array) element that matches a provided value. ## [](#signature)Signature ```fql-sig lastIndexOf(element: A) => Number | Null lastIndexOf(element: A, end: Number) => Number | Null ``` ## [](#description)Description Searches, right-to-left, for the first element that matches a provided value and returns the index of the element if a match is found. If an optional end index is provided, the method searches right-to-left starting at the end index and returns the first matching index (inclusive). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Value to search for in the Array elements. | | end | Number | | Starting index (inclusive) of the right-to-left search. Must be an Int. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Index of the element that matches the provided value. | | Null | Returned if a match isn’t found. | ## [](#examples)Examples ```fql ['a', 'b', 'c', 'b'].lastIndexOf('c') ``` ``` 2 ``` # `array.lastIndexWhere()` Get the index of the last [Array](../../../fql/types/#array) element that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig lastIndexWhere(predicate: (A => Boolean | Null)) => Number | Null lastIndexWhere(predicate: A => Boolean | Null, end: Number) => Number | Null ``` ## [](#description)Description Searches, right-to-left, for the first element that matches a provided [predicate function](../../../fql/functions/#predicates) and returns the index of the element if a match is found. If the optional end index is provided, the method searches right-to-left starting at the end index and returns the first matching index (inclusive). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Function | true | Anonymous predicate function that:Accepts an Array element as its only argument.
You can pass in this argument using arrow function syntax.Returns Boolean or Null.If the predicate evaluates to true for an element, the element is considered a match. | | end | Number | | Starting index (inclusive) of the right-to-left search. Must be an Int. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Index of the element that matches the provided predicate. | | Null | Returned if a match isn’t found. | ## [](#examples)Examples ```fql ['a', 'b', 'c', 'b'].lastIndexWhere(v => v == 'b') ``` ``` 3 ``` # `array.lastWhere()` Get the last element of an [Array](../../../fql/types/#array) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig lastWhere(predicate: (A => Boolean | Null)) => A | Null ``` ## [](#description)Description Gets the last element of the calling [Array](../../../fql/types/#array) that matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts an Array element as its only argument. You can pass in this argument using arrow function syntax. Supports shorthand-syntax for objects and documents.Returns Boolean or Null.The method returns the last Array element for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Last element of the Array that matches the predicate. | | Null | Returned if no Array element matches the predicate or the Array is empty. | ## [](#examples)Examples ```fql let iter = [1, 2, 3, 4] iter.lastWhere(v => v > 2) ``` ``` 4 ``` # `array.map()` Apply a provided [function](../../../fql/functions/) to each element of an [Array](../../../fql/types/#array). Can’t perform writes. ## [](#signature)Signature ```fql-sig map(mapper: (A => B)) => Array ``` ## [](#description)Description Creates an [Array](../../../fql/types/#array) by applying a mapper function to each element in the calling [Array](../../../fql/types/#array). Writes are not permitted. The calling Array isn’t changed. ### [](#array-iteration-methods)Array iteration methods FQL provides several methods for iterating over an Array. [`array.forEach()`](../foreach/), [`array.map()`](./), and [`array.flatMap()`](../flatmap/) are similar but return different values. | Method | Return value | | --- | --- | --- | --- | | array.forEach() | Null. | | array.map() | Array. | | array.flatMap() | Array, flattened by one level. | For examples, see: * [`array.forEach()` vs. `array.map()`](#foreach-vs-map) * [`array.map()` vs. `array.flatMap()`](#map-vs-flatmap) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | mapper | Function | true | Anonymous FQL function to call on each element of the calling Array. Each call returns a value that’s returned in the result Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array containing values returned by each mapper function call. | ## [](#examples)Examples ### [](#basic)Basic 1. 
Create an [Array](../../../fql/types/#array) by mapping [Array](../../../fql/types/#array) elements to sub[Array](../../../fql/types/#array)s that are constructed from the element value and the element value plus one, using an anonymous [Function](../../../fql/types/#func): ```fql [1, 2, 3].map(x => [x, x + 1]) ``` ``` [ [ 1, 2 ], [ 2, 3 ], [ 3, 4 ] ] ``` 2. Create an [Array](../../../fql/types/#array) by passing a top-level object method as the mapping [Function](../../../fql/types/#func): ```fql [{name: "D"}, {name: "E"}].map(Collection.create) ``` ``` [ { name: "D", coll: Collection, ts: Time("2099-02-18T20:06:25.620Z"), indexes: {}, constraints: [], history_days: 0 }, { name: "E", coll: Collection, ts: Time("2099-02-18T20:06:25.620Z"), indexes: {}, constraints: [], history_days: 0 } ] ``` ### [](#foreach-vs-map)`array.forEach()` vs. `array.map()` You can use both [`array.forEach()`](../foreach/) and [`array.map()`](./) to iterate through an Array. However, each method returns a different value type. [`array.forEach()`](../foreach/) returns `null`: ```fql // Create an Array of objects that contain category data. let categories = [ { "name": "Electronics", "description": "Bargain electronics!" }, { "name": "Books", "description": "Bargain books!" } ] // Use `forEach()` to create a `Category` collection document for each // element of the previous Array. categories.forEach(doc => Category.create({ doc })) ``` ``` null ``` Although it returns `null`, [`array.forEach()`](../foreach/) still performed the requested operations. To verify: ```fql // Get all `Category` collection documents Category.all() ``` ``` // The results contain `Customer` documents created by // the previous `forEach()` call. { data: [ ... { id: "410665732340187209", coll: Category, ts: Time("2099-10-02T20:16:51.560Z"), products: "hdW...", name: "Electronics", description: "Bargain electronics!" }, { id: "410665732340188233", coll: Category, ts: Time("2099-10-02T20:16:51.560Z"), products: "hdW...", name: "Books", description: "Bargain books!" } ] } ``` [`array.map()`](./) returns an [Array](../../../fql/types/#array): ```fql // Create an Array of objects that contain category data. let categories = [ { "name": "Movies", "description": "Bargain movies!" }, { "name": "Music", "description": "Bargain music!" } ] // Use `map()` to create a `Category` collection document for each // element of the previous Array. categories.map(doc => Category.create({ doc })) ``` In this case, the Array contains documents created by the [`array.map()`](./) call: ``` [ { id: "410655308219678797", coll: Category, ts: Time("2099-10-02T17:31:10.366Z"), products: "hdW...", name: "Movies", description: "Bargain movies!" }, { id: "410655308219679821", coll: Category, ts: Time("2099-10-02T17:31:10.366Z"), products: "hdW...", name: "Music", description: "Bargain music!" } ] ``` ### [](#map-vs-flatmap)`array.map()` vs. `array.flatMap()` [`array.flatMap()`](../flatmap/) is similar to [`array.map()`](./), except [`array.flatMap()`](../flatmap/) also flattens the resulting Array by one level. In the following example, [`array.map()`](./) returns a two-dimensional Array: ```fql // Create an Array of product names. let products = [ "limes", "avocados" ] // Use `map()` to get a Set of matching `Product` collection // documents for each name. Convert each Set to an Array. products.map(product => { Product.byName(product).toArray() }) ``` ``` // Two-dimensional Array. 
[ [ { id: "777", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("789") } ], [ { id: "444", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ] ] ``` To flatten the result to a one-dimensional array, use [`array.flatMap()`](../flatmap/) instead: ```fql // Create an Array of product names. let products = [ "limes", "avocados" ] // Use `flatMap()` to get a Set of matching `Product` collection // documents for each name. Convert each Set to an Array. // Then flatten the resulting Array by one level. products.flatMap(product => { Product.byName(product).toArray() }) ``` ``` // One-dimensional Array. [ { id: "777", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("789") }, { id: "444", coll: Product, ts: Time("2099-10-02T19:37:36.357Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ] ``` # `array.nonEmpty()` Test if an [Array](../../../fql/types/#array) is not empty. ## [](#signature)Signature ```fql-sig nonEmpty() => Boolean ``` ## [](#description)Description Tests if the calling [Array](../../../fql/types/#array) is not empty. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Array is not empty. Otherwise, false. | ## [](#examples)Examples ```fql let iter = [] iter.nonEmpty() ``` ``` false ``` # `array.order()` Sort an [Array](../../../fql/types/#array)'s elements. ## [](#signature)Signature ```fql-sig order(ordering: ...(A => Any) & {}) => Array ``` ## [](#description)Description Creates a sorted [Array](../../../fql/types/#array) by applying one or more sorting criteria to the calling [Array](../../../fql/types/#array). You define each sorting criterion by wrapping `asc()` (ascending) or `desc()` (descending) around a read-only [anonymous function](../../../fql/functions/). The first criterion has the highest sorting priority, with priority decreasing for each subsequent criterion. The calling Array remains unchanged. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | ordering | Generic | | One or more sorting criteria, separated by commas.Each criterion is a read-only anonymous function, optionally wrapped in asc() (ascending) or desc() (descending) to indicate sort order.If neither asc() or desc() is provided, asc() is used by default.The anonymous function is passed each Array element as an argument. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | New Array with elements in requested order. | ## [](#examples)Examples Order the Array elements returned by the function in ascending order. ```fql [3, 1, 2, 4].order(asc(v => v)) ``` ``` [ 1, 2, 3, 4 ] ``` # `array.prepend()` Prepend an element to an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig prepend(element: A) => Array ``` ## [](#description)Description Prepends an element to an [Array](../../../fql/types/#array) instance. The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Element to prepend to the Array. 
| ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array with the prepended element. | ## [](#examples)Examples ```fql [2, 3].prepend(1) ``` ``` [ 1, 2, 3 ] ``` # `array.reduce()` Reduce an [Array](../../../fql/types/#array) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from left to right. Uses the first element as the initial value. ## [](#signature)Signature ```fql-sig reduce(reducer: ((A, A) => A)) => A | Null ``` ## [](#description)Description Iterates through each element in an [Array](../../../fql/types/#array) to perform a rolling operation. For example, you can use `reduce()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `reduce()` calls a reducer callback function on every element of the Array from left to right. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. The first element in the Array serves as the initial accumulator. * The current element’s value. The method returns the result of the last iteration. The calling Array isn’t changed. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce an Array to a single value. These methods include: * [`array.fold()`](../fold/) * [`array.foldRight()`](../foldright/) * [`array.reduce()`](./) * [`array.reduceRight()`](../reduceright/) The methods are similar but have the following differences: * [`array.fold()`](../fold/) and [`array.foldRight()`](../foldright/) accept an initial _seed_ value and use it as the initial _accumulator_. [`array.reduce()`](./) and [`array.reduceRight()`](../reduceright/) use the array’s first element as the initial _accumulator_. * [`array.fold()`](../fold/) and [`array.reduce()`](./) iterate through the Array’s elements from left to right. [`array.foldRight()`](../foldright/) and [`array.reduceRight()`](../reduceright/) iterate through the Array’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | reducer | Function | true | Anonymous FQL function to call on each element in the Array. | ### [](#reducer-function-parameters)Reducer function parameters: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Any | true | Value returned by the previous reducer function call. On the first call, seed is passed as the accumulator. | | current | Any | true | The current element’s value. | ## [](#return)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. | | Null | Returns Null if the calling Array is empty. | ## [](#examples)Examples Reduce the [Array](../../../fql/types/#array) elements to a single value: ```fql ["A", "B", "C"].reduce((prev, cur) => prev + cur) ``` ``` "ABC" ``` # `array.reduceRight()` Reduce an [Array](../../../fql/types/#array) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from right to left. Uses the first element as the initial value. 
## [](#signature)Signature ```fql-sig reduceRight(reducer: ((A, A) => A)) => A | Null ``` ## [](#description)Description Iterates through each element in an [Array](../../../fql/types/#array) to perform a rolling operation. For example, you can use `reduceRight()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `reduceRight()` calls a reducer callback function on every element of the Array from right to left. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. The last element in the Array serves as the initial accumulator. * The current element’s value. The method returns the result of the last iteration. The calling Array isn’t changed. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce an Array to a single value. These methods include: * [`array.fold()`](../fold/) * [`array.foldRight()`](../foldright/) * [`array.reduce()`](../reduce/) * [`array.reduceRight()`](./) The methods are similar but have the following differences: * [`array.fold()`](../fold/) and [`array.foldRight()`](../foldright/) accept an initial _seed_ value and use it as the initial _accumulator_. [`array.reduce()`](../reduce/) and [`array.reduceRight()`](./) use the array’s first element as the initial _accumulator_. * [`array.fold()`](../fold/) and [`array.reduce()`](../reduce/) iterate through the Array’s elements from left to right. [`array.foldRight()`](../foldright/) and [`array.reduceRight()`](./) iterate through the Array’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | reducer | Function | true | Anonymous FQL function to call on each element in the Array. | ### [](#reducer-function-parameters)Reducer function parameters: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Any | true | Value returned by the previous reducer function call. On the first call, seed is passed as the accumulator. | | current | Any | true | The current element’s value. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. | | Null | Returns Null if the calling Array is empty. | ## [](#examples)Examples Reduce the [Array](../../../fql/types/#array) elements, right to left: ```fql ["A", "B", "C"].reduceRight((prev, cur) => prev + cur) ``` ``` "CBA" ``` # `array.reverse()` Reverse the order of an [Array](../../../fql/types/#array)'s elements. ## [](#signature)Signature ```fql-sig reverse() => Array ``` ## [](#description)Description Reverses the order of the calling [Array](../../../fql/types/#array)'s elements. The calling Array isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array containing reversed elements. | ## [](#examples)Examples ```fql [1, 2, 3].reverse() ``` ``` [ 3, 2, 1 ] ``` # `array.slice()` Get a subset of an [Array](../../../fql/types/#array)'s elements based on provided indexes. 
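When only a start index is provided, the copy runs from that index (inclusive) through the end of the Array. A minimal sketch of the single-argument form:

```fql
// Copy elements from index 2 (inclusive) to the end.
[1, 2, 3, 4, 5].slice(2)
```

```
[ 3, 4, 5 ]
```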
## [](#signature)Signature ```fql-sig slice(from: Number) => Array slice(from: Number, until: Number) => Array ``` ## [](#description)Description Creates an [Array](../../../fql/types/#array) from the calling [Array](../../../fql/types/#array) by copying the [Array](../../../fql/types/#array) elements from a provided start index (inclusive) up to the end index (exclusive). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Number | true | Zero-based Array element index to start copying (inclusive). Must be an Int. A negative start index counts from Array element zero. | | until | Int | | Zero-based Array element index to end copying (exclusive). Must be an Int.When the end index is larger than the Array length, all elements are copied from start index to the last Array element. If end index is less than or equal to the start index, an empty Array is returned. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array with elements from the start index to the end index (inclusive) of the calling Array. | ## [](#examples)Examples Create an [Array](../../../fql/types/#array) of the second and third elements of the calling [Array](../../../fql/types/#array): ```fql [1, 2, 3, 4, 5].slice(1,3) ``` ``` [ 2, 3 ] ``` # `array.take()` Get the first _N_ elements of an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig take(limit: Number) => Array ``` ## [](#description)Description Takes the first _N_ elements from the calling [Array](../../../fql/types/#array) and returns them as a new [Array](../../../fql/types/#array). If there are fewer values than _N_ elements in the [Array](../../../fql/types/#array), all elements are returned. The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | limit | Number | true | Number of elements to return from the start of the Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array containing the requested elements. | ## [](#examples)Examples ```fql [1, 2, 3, 4, 5].take(2) ``` ``` [ 1, 2 ] ``` # `array.toSet()` Convert an [Array](../../../fql/types/#array) to a [Set](../../../fql/types/#set). ## [](#signature)Signature ```fql-sig toSet() => Set ``` ## [](#description)Description Converts the calling [Array](../../../fql/types/#array) to a [Set](../../../fql/types/#set). The calling Array isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set representation of the Array instance. | ## [](#examples)Examples ```fql [1, 2].toSet() ``` ``` { data: [ 1, 2 ] } ``` # `array.toString()` Convert an Array to a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description Converts an Array to a [String](../../../fql/types/#string) representation. The calling Array isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | String representation of the Array. | ## [](#examples)Examples ```fql [1, 2].toString() ``` ``` "[1, 2]" ``` # `array.where()` Get the elements of an [Array](../../../fql/types/#array) that match a provided [predicate](../../../fql/functions/#predicates). 
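Like other predicate-based methods, `where()` supports shorthand field syntax for Arrays of objects or documents. A minimal sketch using plain objects:

```fql
// Use shorthand field syntax to match objects by their `price` field.
[
  { name: "limes", price: 299 },
  { name: "avocados", price: 399 }
].where(.price < 300)
```

```
[ { name: "limes", price: 299 } ]
```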
## [](#signature)Signature ```fql-sig where(predicate: (A => Boolean | Null)) => Array ``` ## [](#description)Description Returns an [Array](../../../fql/types/#array) of elements from the calling [Array](../../../fql/types/#array) that match a provided [predicate function](../../../fql/functions/#predicates). The calling Array isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | Yes | Anonymous predicate function that:Accepts an Array element as its only argument. Supports shorthand-syntax for objects and documents.Returns a Boolean or NullThe method returns an Array of elements for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array containing elements of the calling Array that match the predicate. If there are no matching elements, the Array is empty. | ## [](#examples)Examples ```fql [1, 2, 3, 4].where(v => v > 2) ``` ``` [ 3, 4 ] ``` ## [](#see-also)See also [`array.filter()`](../filter/) # Bytes [Bytes](../../fql/types/#bytes) methods and properties. ## [](#description)Description Bytes methods let you create and manipulate FQL [Bytes](../../fql/types/#bytes) values. A [Bytes](../../fql/types/#bytes) value stores a byte array, represented as a Base64-encoded string. You can use [Bytes](../../fql/types/#bytes) to store binary data in a Fauna database. ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Bytes() | Convert a Base64-encoded string to an FQL Bytes value. | | Bytes.fromBase64() | Convert a Base64-encoded string to an FQL Bytes value. | ## [](#instance-methods)Instance methods | Method | Description | | --- | --- | --- | --- | | bytes.toBase64() | Convert an FQL Bytes value to a Base64-encoded string. | | bytes.toString() | Convert an FQL Bytes value to a Base64-encoded string. | # `Bytes()` Convert a Base64-encoded string to an FQL [Bytes](../../../fql/types/#bytes) value. ## [](#signature)Signature ```fql-sig Bytes(encoded: String) => Bytes ``` ## [](#description)Description Converts a Base64-encoded string to an FQL [Bytes](../../../fql/types/#bytes) value. A [Bytes](../../../fql/types/#bytes) value stores a byte array, represented as a Base64-encoded string. You can use [Bytes](../../../fql/types/#bytes) to store binary data in a Fauna database. This method is equivalent to [`Bytes.fromBase64()`](../static-frombase64/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | encoded | String | true | Base64-encoded string representing a byte array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Bytes | FQL Bytes value. | ## [](#examples)Examples ```fql Bytes("SGVsbG8sIEZhdW5hIQ==") ``` The method returns an FQL [Bytes](../../../fql/types/#bytes) value: ``` Bytes("SGVsbG8sIEZhdW5hIQ==") ``` # `Bytes.fromBase64()` Convert a Base64-encoded string to an FQL [Bytes](../../../fql/types/#bytes) value. ## [](#signature)Signature ```fql-sig Bytes.fromBase64(encoded: String) => Bytes ``` ## [](#description)Description Converts a Base64-encoded string to an FQL [Bytes](../../../fql/types/#bytes) value. A [Bytes](../../../fql/types/#bytes) value stores a byte array, represented as a Base64-encoded string. You can use [Bytes](../../../fql/types/#bytes) to store binary data in a Fauna database. This method is equivalent to [`Bytes()`](../bytes-type/). 
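For instance, here's a minimal sketch of using the resulting [Bytes](../../../fql/types/#bytes) value to store binary data in a document field. The `Attachment` collection and its `name` and `content` fields are assumptions for illustration only:

```fql
// Store a small binary payload in a document field.
// The `Attachment` collection and `content` field are hypothetical.
Attachment.create({
  name: "greeting.txt",
  content: Bytes.fromBase64("SGVsbG8sIEZhdW5hIQ==")
})
```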
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | encoded | String | true | Base64-encoded string representing a byte array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Bytes | FQL Bytes value. | ## [](#examples)Examples ```fql Bytes.fromBase64("SGVsbG8sIEZhdW5hIQ==") ``` The method returns an FQL [Bytes](../../../fql/types/#bytes) value: ``` Bytes("SGVsbG8sIEZhdW5hIQ==") ``` # `bytes.toBase64()` Convert an FQL [Bytes](../../../fql/types/#bytes) value to a Base64-encoded string. ## [](#signature)Signature ```fql-sig toBase64() => String ``` ## [](#description)Description Converts an FQL [Bytes](../../../fql/types/#bytes) value to a Base64-encoded string. A [Bytes](../../../fql/types/#bytes) value stores a byte array, represented as a Base64-encoded string. You can use [Bytes](../../../fql/types/#bytes) to store binary data in a Fauna database. This method is equivalent to [`bytes.toString()`](../instance-tostring/). ## [](#parameters)Parameters None. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Base64-encoded String. | ## [](#examples)Examples ```fql Bytes("SGVsbG8sIEZhdW5hIQ==").toBase64() ``` The method returns a [String](../../../fql/types/#string): ``` "SGVsbG8sIEZhdW5hIQ==" ``` # `bytes.toString()` Convert an FQL [Bytes](../../../fql/types/#bytes) value to a Base64-encoded string. ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description Converts an FQL [Bytes](../../../fql/types/#bytes) value to a Base64-encoded string. A [Bytes](../../../fql/types/#bytes) value stores a byte array, represented as a Base64-encoded string. You can use [Bytes](../../../fql/types/#bytes) to store binary data in a Fauna database. This method is equivalent to [`bytes.toBase64()`](../instance-tobase64/). ## [](#parameters)Parameters None. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Base64-encoded String. | ## [](#examples)Examples ```fql Bytes("SGVsbG8sIEZhdW5hIQ==").toString() ``` The method returns a [String](../../../fql/types/#string): ``` "SGVsbG8sIEZhdW5hIQ==" ``` # Collection | Learn: Collections | | --- | --- | --- | You add data to Fauna as JSON-like [documents](../../../learn/data-model/documents/), stored in [collections](../../../learn/data-model/collections/). ## [](#collection)`Collection` collection Fauna stores user-defined [collections](../../../learn/data-model/collections/) as documents in the `Collection` system collection. These documents have the [CollectionDef](../../fql/types/#collectiondef) type and are an FQL version of the FSL [collection schema](../../fsl/collection/). `Collection` documents have the following FQL structure: ```fql { name: "Customer", alias: "Cust", coll: Collection, ts: Time("2099-10-03T20:45:53.780Z"), computed_fields: { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" 
}, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } }, fields: { name: { signature: "String" }, email: { signature: "String" }, address: { signature: "{ street: String, city: String, state: String, postalCode: String, country: String }" } }, migrations: [ { add_wildcard: {} } ], wildcard: "Any", history_days: 0, ttl_days: 10, document_ttls: true, indexes: { byEmail: { terms: [ { field: ".email", mva: false } ], values: [ { field: ".email", order: "desc", mva: false }, { field: ".name", order: "asc", mva: false } ], queryable: true, status: "complete" }, }, constraints: [ { unique: [ { field: ".email", mva: false } ], status: "active" } ], data: { desc: "The Customer collection definition" } } ``` | Field | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | name | String | | true | Name of the collection. | | alias | String | Null | | | Alias used to reference a collection with a name that conflicts with a reserved schema name. By creating and using an alias, the resource doesn’t have to be renamed. See Schema aliasing. | | coll | String | true | | Collection name: Collection. | | ts | Time | true | | Last time the document was created or updated. | | computed_fields | Object | Null | | | FQL version of the collection schema’s computed fields. See FSL collection schema: Computed field definitions | | fields | Object | Null | | | FQL version of the collection schema’s field definitions. See FSL collection schema: Field definitions. | | migrations | Array | Null | | | FQL version of the collection schema’s migrations block. See FSL collection schema: Migrations block. | | wildcard | String | Null | | | FQL version of the collection schema’s wildcard constraint. See Wildcard constraints. | | history_days | Number | Null | | | Number of days of history to retain for collection documents. When the number of days lapses, document snapshots older than the interval are removed. Does not affect the current version of documents.If omitted or unset, defaults to 0 (retain no history). See Document history.history_days also affects events available for event feeds and event streams. See event feeds and event streams. | | ttl_days | Number | Null | | | Documents are deleted ttl_days number of days after their last write.Defaults to null (Retained documents indefinitely).See Document time-to-live (TTL). | | document_ttls | Boolean | Null | | | If true, you can write to the ttl field in the collection’s documents.document_ttls does not stop ttl-related deletions or affect ttl values set by the collection schema’s ttl_days field.See Enable or disable ttl writes. | | indexes | Object | Null | | | FQL version of the collection schema’s index definitions. See Index definitions.indexes objects include fields you can use to monitor the index build. See indexes build fields. | | constraints | Array | Null | | | FQL version of the collection schema’s check and unique constraints. See FSL collection schema: Check constraint definitions and FSL collection schema: Unique constraint definitions. | | data | { *: Any } | Null | | | Arbitrary user-defined metadata for the document. | ### [](#build)`indexes` build fields `indexes` objects include fields you can use to monitor the [index build](../../../learn/data-model/indexes/#builds): | Field | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | queryable | Boolean | Null | true | | If true, the index is queryable. If false, the index is not queryable. 
| | status | Union of pending, active, and failed. | true | | Status of the index build. Possible values are pending, active, and failed. | ## [](#static-methods)Static methods You can use the following static methods to manage the `Collection` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Collection() | Access a collection by its name. | | Collection.all() | Get a Set of all collection definitions. | | Collection.byName() | Get a collection definitions by its name. | | Collection.create() | Create a collection. | | Collection.firstWhere() | Get the first collection definition that matches a provided predicate. | | Collection.toString() | Get "Collection" as a String. | | Collection.where() | Get a Set of collection definitions that match a provided predicate. | ## [](#instance-properties)Instance properties `Collection` documents have the following properties. You access the property using an existing collection’s name. | Property | Description | | --- | --- | --- | --- | | collection.definition | Get a collection definition, represented as a Collection document with the CollectionDef type. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage collection definitions, represented as `Collection` documents, in FQL. You call the methods on a [CollectionDef](../../fql/types/#collectiondef). | Method | Description | | --- | --- | --- | --- | | collectionDef.delete() | Delete a collection. | | collectionDef.exists() | Test if a collection exists. | | collectionDef.replace() | Replaces a collection definition. | | collectionDef.update() | Update a collection definition. | ## [](#name-methods)Collection name methods You can use the following collection name methods to create and manage documents within a user-defined collection using FQL. You call the methods directly on an existing collection’s name. | Method | Description | | --- | --- | --- | --- | | collection.all() | Get a Set of all documents in a collection. | | collection.byId() | Get a collection document by its document id. | | collection.create() | Create a collection document. | | collection.createData() | Create a collection document from an object that may contain metadata fields. | | collection.firstWhere() | Get the first collection document that matches a provided predicate. | | collection.where() | Get a Set of collection documents that match a provided predicate. | | collection.indexName() | Call an index as a method to get a Set of matching collection documents. | # `collection.definition` | Learn: Collections | | --- | --- | --- | Get a [collection definition](../../../../learn/data-model/collections/), represented as a [`Collection` document](../#collection) with the [CollectionDef](../../../fql/types/#collectiondef) type. ## [](#signature)Signature ```fql-sig .definition: CollectionDef ``` ## [](#description)Description A collection’s schema, represented as a [`Collection` document](../#collection) with the [CollectionDef](../../../fql/types/#collectiondef) type. `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). You access the `.definition` property using an existing collection’s name. 
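You can also chain [collection instance methods](../#instance-methods) to the returned definition (see Definition methods below). For example, a minimal sketch that updates a definition field through `.definition`; the `Product` collection and the `history_days` value are assumptions for illustration:

```fql
// Update the `Product` collection definition
// through its `.definition` property.
// The collection and value are assumptions for illustration.
Product.definition.update({ history_days: 3 })
```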
### [](#definition-properties)Definition properties You can use [dot or bracket notation](../../../fql/dot-notation/#dot-notation-field-accessor) to access specific fields in the definition. See [Access definition properties](#access). ### [](#definition-methods)Definition methods The `definition` property supports [collection instance methods](../#instance-methods). ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | CollectionDef | Definition for the collection, represented as a Collection document. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Get the definition for // the `Customer` collection. Customer.definition ``` ``` { name: "Customer", coll: Collection, ts: Time("2099-10-03T20:45:53.780Z"), history_days: 0, fields: { name: { signature: "String" }, email: { signature: "String" }, address: { signature: "{ street: String, city: String, state: String, postalCode: String, country: String }" } }, wildcard: "Any", computed_fields: { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" }, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } }, indexes: { byEmail: { terms: [ { field: ".email", mva: false } ], values: [ { field: ".email", order: "desc", mva: false }, { field: ".name", order: "asc", mva: false } ], queryable: true, status: "complete" }, }, ttl_days: 10, constraints: [ { unique: [ { field: ".email", mva: false } ], status: "active" } ], document_ttls: true, migrations: [ { add_wildcard: {} } ] } ``` ### [](#access)Access definition properties Use [dot or bracket notation](../../../fql/dot-notation/#dot-notation-field-accessor) to access specific fields in the definition: ```fql // Access `computed_fields` definitions for // the `Customer` collection. Customer.definition.computed_fields ``` ``` // Only returns computed field definitions. { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" }, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } } ``` # `Collection()` | Learn: Collections | | --- | --- | --- | Access a [collection](../../../../learn/data-model/collections/) by its name. ## [](#signature)Signature ```fql-sig Collection(collection: String) => Any ``` ## [](#description)Description Accesses a collection by its name. You can chain [collection name methods](../#name-methods) to the returned collection. ### [](#errors)Errors If you attempt to access a collection that doesn’t exist, Fauna returns a query runtime error with an `invalid_argument` [error code](../../../http/reference/errors/) and a 400 HTTP status code: ``` invalid_argument error: invalid argument `collection`: No such user collection `Foo`. at *query*:1:11 | 1 | Collection("Foo") | ^^^^^^^ | ``` ### [](#comparison-to-collectionname)Comparison to `CollectionName` Calling `Collection()` is similar to accessing a collection by its `CollectionName` directly, except: * `Collection()` returns an [Any](../../../fql/types/#any) value * `CollectionName` is a [Collection](../../../fql/types/#collection) value This difference only affects static typing, not runtime behavior. Both `Collection()` and `CollectionName` support the same [collection name methods](../#name-methods). In most cases, you should use `CollectionName`. However, `Collection()` is useful if you need to iterate through a list of collection names. See [Dynamically specify a collection name](#iterate).
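For instance, a minimal sketch that iterates over a list of collection names and returns a document count for each, in the same order as the names. The `Product` and `Category` collection names are assumptions for illustration:

```fql
// Iterate over a list of collection names and count
// the documents in each collection.
["Product", "Category"].map(name => Collection(name).all().count())
```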
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | collection | String | true | Collection name. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Any | The named collection. You can chain collection name methods to the returned collection. | ## [](#examples)Examples `Collection()` returns an [Any](../../../fql/types/#any) value. For example: ```fql Collection("Product") ``` Returns an [Any](../../../fql/types/#any) value: ``` Product ``` ### [](#difference-with-collectionname)Difference with `CollectionName` Accessing a collection by its `CollectionName` directly returns a similar value, typed as a [Collection](../../../fql/types/#collection). For example: ```fql Product ``` Returns the `Product` [Collection](../../../fql/types/#collection) value: ``` Product ``` ### [](#method-chaining)Method chaining In most cases, you’ll chain other [collection name methods](../#name-methods) to `Collection()`. For example: ```fql let produce = Category.byName("produce").first() Collection("Product").create({ name: "zebra pinata", description: "Giant Zebra Pinata", price: 23_99, stock: 0, category: produce }) ``` Returns an [Any](../../../fql/types/#any) value: ``` { id: "12345", coll: Product, ts: Time("2099-06-24T21:21:37.170Z"), name: "zebra pinata", description: "Giant Zebra Pinata", price: 2399, stock: 0, category: Category("789") } ``` ### [](#iterate)Dynamically specify a collection name Use `Collection()` to dynamically specify collection names in a query. For example, the following [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) accepts a collection name as an argument and uses `Collection()` to access the collection: ```fsl // Accepts a collection name as an argument. function getPriceLowtoHigh(collection) { // Uses `Collection()` to dynamically specify // the collection name. Collection(collection).sortedByPriceLowToHigh() { price, name, description } } ``` The following query calls the function: ```fql // Calls the `getPriceLowtoHigh()` UDF with // a `Product` collection argument. getPriceLowtoHigh("Product") ``` # `Collection.all()` | Learn: Collections | | --- | --- | --- | Get a Set of all [collection definitions](../../../../learn/data-model/collections/). ## [](#signature)Signature ```fql-sig Collection.all() => Set Collection.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [collection definitions](../../../../learn/data-model/collections/), represented as [`Collection` documents](../), for the database. `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). To limit the Set, you can provide an optional range of [`Collection` documents](../). If this method is the last expression in a query, the first page of the [Set](../../../fql/types/#set) is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema.
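The result is an ordinary Set, so you can chain Set methods to it. For example, a minimal sketch that returns only the names of the database's collections:

```fql
// List only the names of the database's
// collection definitions.
Collection.all().map(def => def.name)
```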
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Collection documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded Collection.all() call. See Range examples. The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches. If a range is omitted, all collection definitions are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be a Collection document. | | to | Any | | End of the range (inclusive). Must be a Collection document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Collection documents in the provided range. If a range is omitted, all collection definitions are returned. The Set is empty if: The database has no collection definitions. There are no collection definitions in the provided range. The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples 1. Get all collection definitions for the database: ```fql Collection.all() ``` ``` { data: [ { name: "Customer", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... }, { name: "Product", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... }, { name: "Category", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... }, { name: "Order", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... }, { name: "OrderItem", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... } ] } ``` 2. Given the previous Set, get all collection definitions starting with `Category`: ```fql Collection.all({ from: Collection.byName("Category") }) ``` ``` { data: [ { name: "Category", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... }, { name: "Order", coll: Collection, ts: Time("2099-07-30T22:08:57.650Z"), ... }, { name: "OrderItem", coll: Collection, ts: Time("2099-07-30T22:08:57.650Z"), ... } ] } ``` 3. Get the Set of collection definitions from `Category` to `Order`, inclusive: ```fql Collection.all({ from: Collection.byName("Category"), to: Collection.byName("Order") }) ``` ``` { data: [ { name: "Category", coll: Collection, ts: Time("2099-07-30T22:04:31.325Z"), ... }, { name: "Order", coll: Collection, ts: Time("2099-07-30T22:08:57.650Z"), ... } ] } ``` # `Collection.byName()` | Learn: Collections | | --- | --- | --- | Get a [collection definition](../../../../learn/data-model/collections/) by its name. ## [](#signature)Signature ```fql-sig Collection.byName(name: String) => NamedRef ``` ## [](#description)Description Gets a [collection definition](../../../../learn/data-model/collections/), represented as a [`Collection` document](../), by its name. `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema.
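Because the returned reference can resolve to a `NullDoc`, a common pattern is to check the result before using it. A minimal sketch, assuming a `Product` collection exists:

```fql
// Get the definition only if the collection exists.
let def = Collection.byName("Product")
if (def.exists()) {
  def
} else {
  null
}
```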
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | name | String | true | name of a Collection document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NamedRef | Resolved reference to a Collection document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Collection.byName("Product") ``` ``` { name: "Product", coll: Collection, ts: Time("2099-07-30T22:55:21.520Z"), history_days: 0, constraints: [ { unique: [ { field: ".name", mva: false } ], status: "active" }, { check: { name: "stockIsValid", body: "(product) => product.stock >= 0" } }, { check: { name: "priceIsValid", body: "(product) => product.price > 0" } } ], indexes: { byCategory: { terms: [ { field: ".category", mva: false } ], queryable: true, status: "complete" }, sortedByCategory: { values: [ { field: ".category", order: "asc", mva: false } ], queryable: true, status: "complete" }, byName: { terms: [ { field: ".name", mva: false } ], queryable: true, status: "complete" }, sortedByPriceLowToHigh: { values: [ { field: ".price", order: "asc", mva: false }, { field: ".name", order: "asc", mva: false }, { field: ".description", order: "asc", mva: false }, { field: ".stock", order: "asc", mva: false } ], queryable: true, status: "complete" } }, fields: { name: { signature: "String" }, description: { signature: "String" }, price: { signature: "Int" }, category: { signature: "Ref" }, stock: { signature: "Int" } } } ``` # `Collection.create()` | Learn: Collections | | --- | --- | --- | Create a [collection](../../../../learn/data-model/collections/). ## [](#signature)Signature ```fql-sig Collection.create(data: { name: String, alias: String | Null, computed_fields: { *: { body: String, signature: String | Null } } | Null, fields: { *: { signature: String, default: String | Null } } | Null, migrations: Array<{ backfill: { field: String, value: String } } | { drop: { field: String } } | { split: { field: String, to: Array } } | { move: { field: String, to: String } } | { add: { field: String } } | { move_conflicts: { into: String } } | { move_wildcard: { into: String } } | { add_wildcard: {} }> | Null, wildcard: String | Null, history_days: Number | Null, ttl_days: Number | Null, document_ttls: Boolean | Null, indexes: { *: { terms: Array<{ field: String, mva: Boolean | Null }> | Null, values: Array<{ field: String, mva: Boolean | Null, order: "asc" | "desc" | Null }> | Null, queryable: Boolean | Null } } | Null, constraints: Array<{ unique: Array | Array<{ field: String, mva: Boolean | Null }> } | { check: { name: String, body: String } }> | Null, data: { *: Any } | Null }) => CollectionDef ``` ## [](#description)Description Creates a [collection](../../../../learn/data-model/collections/) with a provided collection definition, represented as a [`Collection` document](../). `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). You can’t create a collection and use it in the same query. Use separate queries instead. ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method adds a collection to the staged schema, not the active schema. 
If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. Unstaged schema changes that trigger an [index build](../../../../learn/data-model/indexes/#builds) may result in downtime where the index is not queryable. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the new Collection document.For supported document fields, see Collection collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | CollectionDef | The new Collection document. | ## [](#examples)Examples Create a collection named `Inventory` without defining fields for the collection: ```fql Collection.create({ name: "Inventory" }) ``` ``` { name: "Inventory", coll: Collection, ts: Time("2099-02-18T20:49:36.680Z"), history_days: 0, indexes: {}, constraints: [] } ``` Fields can be added and changed with the [`document.update()`](../../document/update/) and [`document.replace()`](../../document/replace/) methods. ## [](#see-also)See also [`collectionDef.delete()`](../delete/) # `Collection.firstWhere()` | Learn: Collections | | --- | --- | --- | Get the first [collection definition](../../../../learn/data-model/collections/) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Collection.firstWhere(pred: (CollectionDef => Boolean)) => CollectionDef | Null ``` ## [](#description)Description Gets the first [collection definition](../../../../learn/data-model/collections/), represented as a [`Collection` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Collection document as its only argument. 
Supports shorthand-syntax.Returns a Boolean value.The method returns the first Collection document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | CollectionDef | First Collection document that matches the predicate. | | Null | No Collection document matches the predicate. | ## [](#examples)Examples ```fql Collection.firstWhere(.name.includes('Prod')) ``` ``` { name: "Product", coll: Collection, ts: Time("2099-04-10T14:13:05.740Z"), ... } ``` # `Collection.toString()` | Learn: Collections | | --- | --- | --- | Get `"Collection"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Collection.toString() => String ``` ## [](#description)Description Returns the name of the [`Collection` collection](../) as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Collection" | ## [](#examples)Examples ```fql Collection.toString() ``` ``` "Collection" ``` # `Collection.where()` | Learn: Collections | | --- | --- | --- | Get a Set of [collection definitions](../../../../learn/data-model/collections/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Collection.where(pred: (CollectionDef => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [collection definitions](../../../../learn/data-model/collections/), represented as [`Collection` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). If this method is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Collection document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of Collection documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Collection documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql Collection.where(.name.includes('Prod')) ``` ``` { data: [ { name: "Product", coll: Collection, ts: Time("2099-04-10T17:01:11.995Z"), ... } ] } ``` # `collectionDef.delete()` | Learn: Collections | | --- | --- | --- | Delete a [collection](../../../../learn/data-model/collections/). ## [](#signature)Signature ```fql-sig delete() => NullCollectionDef ``` ## [](#description)Description Deletes a [collection](../../../../learn/data-model/collections/), represented as a [`Collection` document](../). `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. 
See [Collections](../../../../learn/data-model/collections/). ### [](#staged-schema)Staged schema You can’t delete a collection while a database has [staged schema](../../../../learn/schema/manage-schema/#staged). If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullCollectionDef | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ```fql Collection.byName("Category")!.delete() ``` ``` Collection.byName("Category") /* deleted */ ``` # `collectionDef.exists()` | Learn: Collections | | --- | --- | --- | Test if a [collection](../../../../learn/data-model/collections/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if a collection definition, represented as a [`Collection` document](../), exists. `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Collection.byName("Category").exists() // true Collection.byName("Category") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Collection document exists. If false, the Collection document doesn’t exist. | ## [](#examples)Examples ```fql Collection.byName("Product").exists() ``` ``` true ``` # `collectionDef.replace()` | Learn: Collections | | --- | --- | --- | Replaces a [collection definition](../../../../learn/data-model/collections/). 
## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => CollectionDef ``` ## [](#description)Description Replaces all fields in a [collection definition](../../../../learn/data-model/collections/), represented as a [`Collection` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `coll` and `ts` metadata fields, are removed. `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). ### [](#metadata-fields)Metadata fields You can’t use this method to replace the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename a collection while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. Unstaged schema changes that trigger an [index build](../../../../learn/data-model/indexes/#builds) may result in downtime where the index is not queryable. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object containing valid Collection document fields. | true | Fields for the Collection document. Fields not present, excluding the coll and ts metadata fields, in the object are removed.For supported document fields, see Collection collection.The object can’t include the following metadata fields:collts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | CollectionDef | Collection document with replaced fields. | ## [](#examples)Examples ```fql Collection.byName("Customer")!.replace({ name: "Customer", history_days: 0, migrations: [ { add_wildcard: {} } ], computed_fields: { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" 
}, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } }, wildcard: "Any", constraints: [ { unique: [ { field: ".email", mva: false } ], status: "active" } ], indexes: { byEmail: { terms: [ { field: ".email", mva: false } ], queryable: true, status: "complete" } }, document_ttls: true, ttl_days: 10, fields: { name: { signature: "String" }, email: { signature: "String" }, address: { signature: "{ street: String, city: String, state: String, postalCode: String, country: String }" } } }) ``` ``` { name: "Customer", coll: Collection, ts: Time("2099-10-03T21:50:53.550Z"), migrations: [ { add_wildcard: {} } ], indexes: { byEmail: { terms: [ { field: ".email", mva: false } ], queryable: true, status: "complete" } }, constraints: [ { unique: [ { field: ".email", mva: false } ], status: "active" } ], wildcard: "Any", ttl_days: 10, computed_fields: { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" }, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } }, fields: { name: { signature: "String" }, email: { signature: "String" }, address: { signature: "{ street: String, city: String, state: String, postalCode: String, country: String }" } }, history_days: 0, document_ttls: true } ``` # `collectionDef.update()` | Learn: Collections | | --- | --- | --- | Update a [collection definition](../../../../learn/data-model/collections/). ## [](#signature)Signature ```fql-sig update(data: { *: Any }) => CollectionDef ``` ## [](#description)Description Updates a [collection definition](../../../../learn/data-model/collections/), represented as a [`Collection` document](../), with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. `Collection` documents are FQL versions of a database’s FSL [collection schema](../../../fsl/collection/). `Collection` documents have the [CollectionDef](../../../fql/types/#collectiondef) type. See [Collections](../../../../learn/data-model/collections/). ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename a collection while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. Unstaged schema changes that trigger an [index build](../../../../learn/data-model/indexes/#builds) may result in downtime where the index is not queryable. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. 
This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the Collection document.For supported document fields, see the Collection collection.The object can’t include the following metadata fields:collts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | CollectionDef | The updated Collection document. | ## [](#examples)Examples ```fql Collection.byName("Customer")!.update({ ttl_days: 15 }) ``` ``` { name: "Customer", coll: Collection, ts: Time("2099-10-03T20:45:53.780Z"), history_days: 0, fields: { name: { signature: "String" }, email: { signature: "String" }, address: { signature: "{ street: String, city: String, state: String, postalCode: String, country: String }" } }, wildcard: "Any", computed_fields: { cart: { body: "(customer) => Order.byCustomerAndStatus(customer, \"cart\").first()", signature: "Order?" }, orders: { body: "(customer) => Order.byCustomer(customer)", signature: "Set" } }, indexes: { byEmail: { terms: [ { field: ".email", mva: false } ], values: [ { field: ".email", order: "desc", mva: false }, { field: ".name", order: "asc", mva: false } ], queryable: true, status: "complete" }, }, ttl_days: 15, constraints: [ { unique: [ { field: ".email", mva: false } ], status: "active" } ], document_ttls: true, migrations: [ { add_wildcard: {} } ] } ``` # `collection.all()` | Learn: Documents | | --- | --- | --- | Get a Set of all [documents](../../../../learn/data-model/documents/) in a collection. ## [](#signature)Signature ```fql-sig all() => Set all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all documents in a collection. To limit the returned Set, you can provide an optional range. If `all()` is the last expression in a query, the first page of the Set is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#built-in-index)Built-in index Fauna implements `.all()` as a built-in [collection index](../../../../learn/data-model/indexes/). The index uses ascending the [document `id`](../../../../learn/data-model/documents/#meta) as its only [index value](../../../../learn/data-model/indexes/#values). The `.all()` index has no [index terms](../../../../learn/data-model/indexes/#terms). Like all indexes, the `.all()` index reads [historical data when queried](../../../../learn/data-model/indexes/#history). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of collection documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded all() call. See Range examples.The Set only includes documents in this range (inclusive). 
Omit from or to to run unbounded range searches.If a range is omitted, all collection documents are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be a document in the collection. | | to | Any | | End of the range (inclusive). Must be a document in the collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | All matching documents in the collection. | ## [](#examples)Examples ### [](#range)Range examples 1. First, get all documents in the collection instance: ```fql Category.all() ``` ``` { data: [ { id: "123", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "party", description: "Party Supplies" }, { id: "456", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "frozen", description: "Frozen Foods" }, { id: "789", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "produce", description: "Fresh Produce" } ] } ``` 2. Provide a range to get all documents beginning with a given document: ```fql let frozen = Category.byName("frozen").first() Category.all({from: frozen}) ``` ``` { data: [ { id: "456", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "frozen", description: "Frozen Foods" }, { id: "789", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "produce", description: "Fresh Produce" } ] } ``` 3. Get all documents beginning up to a given document: ```fql let frozen = Category.byName("frozen").first() Category.all({to: frozen}) ``` ``` { data: [ { id: "123", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "party", description: "Party Supplies" }, { id: "456", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "frozen", description: "Frozen Foods" } ] } ``` 4. Get all documents between the given `from` and `to` range parameters (inclusive): ```fql let party = Category.byName("party").first() let frozen = Category.byName("frozen").first() Category.all({from: party, to: frozen}) ``` ``` { data: [ { id: "123", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "party", description: "Party Supplies" }, { id: "456", coll: Category, ts: Time("2099-07-30T21:56:38.130Z"), products: "hdW...", name: "frozen", description: "Frozen Foods" } ] } ``` # `collection.byId()` | Learn: Documents | | --- | --- | --- | Get a collection document by its [document `id`](../../../../learn/data-model/documents/#meta). ## [](#signature)Signature ```fql-sig byId(id: ID) => Ref ``` ## [](#description)Description Gets a [collection document](../../../../learn/data-model/documents/) by its [document `id`](../../../../learn/data-model/documents/#meta). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | id | ID | true | ID of the collection document to retrieve.The ID must be a Int or String that be coerced into a 64-bit unsigned integer in the 253-1 range. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Ref | Resolved reference to the collection document. Can resolve to an existing document or a NullDoc. 
| ## [](#examples)Examples Get document by ID: ```fql Product.byId("111") ``` ``` { id: "111", coll: Product, ts: Time("2099-06-25T20:23:49.070Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") } ``` Get document by [Int](../../../fql/types/#integer) value assigned to `id`: ```fql let id = 12345 let produce = Category.byName("produce").first() Product.create({ id: id, name: "key lime", description: "Organic, 1 ct", price: 79, category: produce, stock: 2000 }) Product.byId(id) ``` ``` { id: "12345", coll: Product, ts: Time("2099-10-25T16:52:26.510Z"), name: "key lime", description: "Organic, 1 ct", price: 79, category: Category("789"), stock: 2000 } ``` # `collection.create()` | Learn: Documents | | --- | --- | --- | Create a [collection document](../../../../learn/data-model/documents/). ## [](#signature)Signature ```fql-sig create(data: { id: ID | Null, ttl: Time | Null, data: Null, *: Any }) => ``` ## [](#description)Description Creates a [document](../../../../learn/data-model/documents/) in the collection with the provided document fields. ### [](#reserved-fields)Reserved fields You can’t use this method to write to the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` * `data` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Object containing the document’s fields.For supported fields in user-defined collections, see Document fields.To create a document with a user-provided id, you must use an authentication secret with the create_with_id privilege.The object can’t include the following metadata fields:colltsdata | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | New collection document.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. | ## [](#examples)Examples ### [](#basic)Basic example ```fql Customer.create({ name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } }) ``` ``` { id: "12345", coll: Customer, ts: Time("2099-02-19T14:53:53.940Z"), cart: null, orders: "hdWCxmVPcmRlcoHKhGpieUN1c3RvbWVygcZidjD09oHNSgW6Vc/c8AIABAD2wYIaZxvNCBoz2yWAEA==", name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } } ``` ### [](#id)Create with ID You can specify a custom `id` when creating a document. The `id` must be unique within the collection: ```fql Customer.create({ id: "999", name: "Jane Doe", email: "jane.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } }) ``` If you don’t specify an `` id` ``, Fauna auto-generates one. To create documents with a custom `id`, you must have the [`create_with_id` privilege](../../../fsl/role/#privilege-actions) on the collection. ### [](#default)Default values A [field definition](../../../../learn/schema/#field-definitions) can set a [default field value](../../../fsl/field-definitions/#default) for documents in a collection: ```fsl collection Customer { // `name` accepts `String` and `Null` values. // If missing, defaults to `unknown`. name: String? 
= "unknown" email: String } ``` If you don’t provide a value during document creation, the document uses the default: ```fql Customer.create({ // The `name` field is missing. email: "john.doe@example.com" }) ``` ``` { id: "12345", coll: Customer, ts: Time("2099-02-19T14:53:53.940Z"), email: "john.doe@example.com", // `name` defaulted to `unknown`. name: "unknown" } ``` If you provide an explicit `null` value, the field is `null`. Fields with `null` values aren’t stored or returned. ```fql Customer.create({ // `name` is an explicit `null`. name: null, email: "jane.doe@example.com" }) ``` ``` { id: "12345", coll: Customer, ts: Time("2099-02-19T14:53:53.940Z"), // `name` is not stored or returned. email: "jane.doe@example.com" } ``` ## [](#see-also)See also [`ID()`](../../globals/id/) [`newId()`](../../globals/newid/) # `collection.createData()` | Learn: Documents | | --- | --- | --- | Create a [collection document](../../../../learn/data-model/documents/) from an object that may contain [metadata fields](../../../../learn/data-model/documents/). ## [](#signature)Signature ```fql-sig createData(data: { *: Any }) => ``` ## [](#description)Description Creates a document in the collection with user-provided document fields. If the following [metadata fields](../../../../learn/data-model/documents/) are included, they populate the document `data` [Object](../../../fql/types/#object) field: * `id` * `ts` * `ttl` * `data`. Otherwise, the `data` field isn’t instantiated. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Object containing the document’s fields.For supported fields in user-defined collections, see Document fields.To create a document with a user-provided id, you must use an authentication secret with the create_with_id privilege.Fields with keys that match metadata fields are moved to the data field. See Document fields that populate the data field. | ### [](#data)Document fields that populate the `data` field | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | ttl | Time | | Time-to-live (TTL) for the document. Only present if set. If not present or set to null, the document persists indefinitely. | | id | ID | | ID for the document. The ID is a string-encoded, 64-bit unsigned integer in the 253-1 range. The ID is unique within the collection.IDs are assigned at document creation. To create a document with a user-provided id using collection.createData(), you must use a secret with the create_with_id privilege. If not provided, Fauna generates the id. | | | Any supported data type | | User-defined document field. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | New collection document.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. 
| ## [](#examples)Examples Create a document with the `id` and `coll` [metadata fields](../../../../learn/data-model/documents/): ```fql Customer.createData({ id: 12345, coll: "Person", name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } }) ``` ``` { id: "412999820218728960", coll: Customer, ts: Time("2099-07-30T22:04:39.400Z"), cart: null, orders: "hdW...", name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" }, data: { coll: "Person", id: 12345 } } ``` `createData()` treats any metadata field as a document field and nests it in the document’s `data` property. # `collection.firstWhere()` | Learn: Documents | | --- | --- | --- | Get the first [collection document](../../../../learn/data-model/documents/) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig firstWhere(pred: ( => Boolean)) => | Null ``` ## [](#description)Description Gets the first collection document that matches a provided [predicate function](../../../fql/functions/#predicates). Performance hint: `collection_scan` Queries that call this method emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example: ``` performance_hint: collection_scan - Using firstWhere() on collection Product can cause a read of every document. See https://docs.faunadb.org/performance_hint/collection_scan. at *query*:1:19 | 1 | Product.firstWhere(.name == "limes") | ^^^^^^^^^^^^^^^^^^ | ``` To address the hint, create an [index definition with a `terms`](../../../../learn/data-model/indexes/#exact-match) to look up matching documents instead. For example: ```fsl collection Product { ... // Adds `name` as an index term index byName { terms [.name] values [.name, .price, .description, .stock] } ... } ``` Then call the index in your query. Pass an argument for each term in the index definition. To avoid other performance hints, only [project](../../../fql/projection/) or [map](../../set/map/#project) field values covered by the index definition’s [`values`](../../../../learn/data-model/indexes/#sort-documents): ```fql // Get the first product named "limes" Product.byName("limes").first() { name, price, description, stock } ``` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a collection document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first collection document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | | First collection document that matches the predicate.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. | | Null | Document doesn’t exist. Returned when no collection document matches the predicate. 
| ## [](#examples)Examples ```fql Product.firstWhere(.stock < 20) ``` ``` { id: "999", coll: Product, ts: Time("2099-07-30T21:56:38.130Z"), name: "taco pinata", description: "Giant Taco Pinata", price: 2399, stock: 10, category: Category("123") } ``` # `collection.where()` | Learn: Documents | | --- | --- | --- | Get a Set of [collection documents](../../../../learn/data-model/documents/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig where(pred: ( => Boolean)) => Set<> ``` ## [](#description)Description Gets a Set of collection documents that match a provided [predicate function](../../../fql/functions/#predicates). If this method is the last value in a query, the first page of the Set is returned. See [Pagination](../../../../learn/query/pagination/). Performance hint: `collection_scan` Queries that call this method emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example: ``` performance_hint: collection_scan - Using where() on collection Product can cause a read of every document. See https://docs.faunadb.org/performance_hint/collection_scan. at *query*:1:14 | 1 | Product.where(.name == "limes") | ^^^^^^^^^^^^^^^^^^ | ``` To address the hint, create an [index definition with a `terms`](../../../../learn/data-model/indexes/#exact-match) to look up matching documents instead. For example: ```fsl collection Product { ... // Adds `name` as an index term index byName { terms [.name] values [.name, .price, .description, .stock] } ... } ``` Then call the index in your query. Pass an argument for each term in the index definition. To avoid other performance hints, only [project](../../../fql/projection/) or [map](../../set/map/#project) field values covered by the index definition’s [`values`](../../../../learn/data-model/indexes/#sort-documents): ```fql // Get products named "limes" Product.byName("limes") { name, price, description, stock } ``` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:* Accepts a collection document as its only argument. Supports shorthand-syntax. * Returns a Boolean value.The method returns collection documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set<\> | Set of collection documents that match the predicate. If there are no matching documents, the Set is empty.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. | ## [](#examples)Examples ```fql Product.where(.stock < 20) ``` ``` { data: [ { id: "999", coll: Product, ts: Time("2099-07-30T21:56:38.130Z"), name: "taco pinata", description: "Giant Taco Pinata", price: 2399, stock: 10, category: Category("123") } ] } ``` # `collection.indexName()` | Learn: Indexes | | --- | --- | --- | Call an [index](../../../../learn/data-model/indexes/) as a method to get a Set of matching collection documents. ## [](#signature)Signature ```fql-sig () => Set<> (term: B, ... ) => Set<> (term: B, ..., range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set<> ``` ## [](#description)Description Calls an [index](../../../../learn/data-model/indexes/) by its name as a method to get a Set of matching collection documents. An index stores, or covers, specific document field values for quick retrieval. 
You can use indexes to filter and sort a collection’s documents in a performant way. The call returns: * Documents with terms that exactly match the provided `term` arguments (if any) * Documents within the specified range of `values` (if provided) * All indexed documents sorted by the index `values` if no arguments are provided ### [](#index-definitions)Index definitions You define an index as part of a [collection schema](../../../fsl/collection/). See [FSL collection schema: Index definitions](../../../fsl/indexes/). You can only call an index that is part of the database’s active schema. ## [](#missing-or-null-values)Missing or null values * **Terms:** If an index definition contains terms, Fauna doesn’t index a document if all its index terms are missing or otherwise evaluate to null. This applies even if the document contains index values. * **Values:** If an index definition contains only values, Fauna indexes all documents in the collection, regardless of whether the document’s index values are missing or otherwise null. ## [](#covered-queries)Covered queries If you only [project](../../../fql/projection/) fields that are covered by the index terms or values, Fauna can retrieve the data directly from the index without reading the documents. These are called "covered queries" and are more performant. A query is uncovered if: * It projects fields not included in the index definition. * It has no projection, which means it returns entire documents. Uncovered queries require additional document reads and are typically more expensive. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | term | Generic | true | Arguments that exactly match the index terms defined in the index definition. Must be provided in the same order as defined in the index definition. Required if the index definition includes terms.Use a comma to separate term arguments when passing multiple terms. | | range | Object | | Specifies a range of values to match in the form { from: start, to: end } where:from: The start of the range (inclusive)to: The end of the range (inclusive)Both from and to can be single values or Arrays of values matching the index values in the index definition.When using Arrays, values are compared in order:Documents are first compared against the first Array element.If documents have matching first values, they are compared against subsequent Array elements.For range searches on indexes with descending order values, pass the higher value in from.Omit from or to to run unbounded range searches. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of matching collection documents. Results are sorted according to the index values defined in the index definition.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. | ## [](#examples)Examples ### [](#basic-example)Basic example You define an index in an FSL [collection schema](../../../fsl/collection/): ```fsl collection Product { ... index byName { terms [.name] } ... } ``` Once the index is [built](../../../../learn/data-model/indexes/#builds), you call it as a method on the collection: ```fql // Call the `byName()` index to get `Product` collection // documents with a `name` value of `limes`. Values must // match exactly. Product.byName("limes") ``` The call returns a Set of matching collection documents.
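Because the index call returns a Set, you can chain Set methods and projection onto it. The following is a minimal sketch that assumes the `byName()` index above and a `Product` document named `limes`:

```fql
// Calls the `byName()` index, gets the first matching
// `Product` document, and projects its `name` field.
Product.byName("limes").first() { name }
```

Keep in mind that projecting fields not covered by the index definition requires document reads, as described in the covered query examples below.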
### [](#use-index-terms-for-exact-match-searches)Use index terms for exact match searches You can use index terms to run exact match searches on document field values. The following index definition includes `name` as an index term: ```fsl collection Product { ... index byName { terms [.name] } ... } ``` When you call the index, you must pass an argument for each term in the index definition. ```fql // Get products named "limes" Product.byName("limes") ``` The call returns a Set of `Product` collection documents with a `name` of `limes`. ### [](#pass-multi-terms)Pass multiple index terms The following index definition includes two index terms: ```fsl collection Customer { ... index byName { terms [.firstName, .lastName] } } ``` In an index call, use a comma to separate term arguments. Provide arguments in the same field order used in the index definition. ```fql // Get customers named "Alice Appleseed" Customer.byName("Alice", "Appleseed") ``` The call returns a Set of matching collection documents. ### [](#use-index-values-for-sorting-and-range-searches)Use index values for sorting and range searches You can use index values to sort a collection’s documents. You can also use index values for range searches. ### [](#sort-documents)Sort documents The following index definition includes several index values: ```fsl collection Product { ... index sortedByPriceLowToHigh { values [.price, .name, .description] } } ``` Call the `sortedByPriceLowToHigh()` index with no arguments to return `Product` documents sorted by: * Ascending `price`, then …​ * Ascending `name`, then …​ * Ascending `description`, then …​ * Ascending `id` (default) ```fql // Get products by ascending price, name, and description Product.sortedByPriceLowToHigh() ``` ### [](#sort-in-descending-order)Sort in descending order By default, index values sort results in ascending order. To use descending order, use `desc()` in the index definition: ```fsl collection Product { ... index sortedByPriceHighToLow { values [desc(.price), .name, .description] } ... } ``` Call the index with no arguments to return `Product` documents sorted by: * Descending `price`, then …​ * Ascending `name`, then …​ * Ascending `description`, then …​ * Ascending `id` (default) ```fql // Get products by descending price, // ascending name, and ascending description Product.sortedByPriceHighToLow() ``` ### [](#run-a-range-search)Run a range search You can also use index values for range searches. The following index definition includes several index values: ```fsl collection Product { ... index sortedByPriceLowToHigh { values [.price, .name, .description] } } ``` The index specifies `price` as its first value. The following query passes an argument to run a range search on `price`: ```fql // Get products with a price between // 20_00 (inclusive) and 30_00 (inclusive) Product.sortedByPriceLowToHigh({ from: 20_00, to: 30_00 }) ``` If an index value uses descending order, pass the higher value in `from`: ```fql // Get products with a price between // 20_00 (inclusive) and 30_00 (inclusive) in desc order Product.sortedByPriceHighToLow({ from: 30_00, to: 20_00 }) ``` Omit `from` or `to` to run unbounded range searches: ```fql // Get products with a price greater than or equal to 20_00 Product.sortedByPriceLowToHigh({ from: 20_00 }) // Get products with a price less than or equal to 30_00 Product.sortedByPriceLowToHigh({ to: 30_00 }) ``` ### [](#pass-multi-values)Pass multiple index values Use an Array to pass multiple value arguments. 
Pass the arguments in the same field order used in the index definition. ```fql Product.sortedByPriceLowToHigh({ from: [ 20_00, "l" ], to: [ 30_00, "z" ] }) ``` The index returns any document that matches the first value in the `from` and `to` Arrays. If matching documents have the same values, they are compared against the next Array element value, and so on. For example, the `Product` collection’s `sortedByPriceLowToHigh()` index covers the `price` and `name` fields as index values. The `Product` collection contains two documents: | Document | price | name | | --- | --- | --- | --- | --- | | Doc1 | 4_99 | pizza | | Doc2 | 6_98 | cups | The following query returns both Doc1 and Doc2, in addition to other matching documents: ```fql Product.sortedByPriceLowToHigh({ from: [4_99, "p"] }) ``` The first value (`4_99` and `6_98`) of each document matches the first value (`4_99`) of the `from` Array. Later, you update the document values to: | Document | price | name | | --- | --- | --- | --- | --- | | Doc1 | 4_99 | pizza | | Doc2 | 4_99 | cups | The following query no longer returns Doc2: ```fql Product.sortedByPriceLowToHigh({ from: [4_99, "p"] }) ``` Although the first value (`4_99`) in both documents matches the first value in the `from` Array, the second value (`cups`) in Doc2 doesn’t match the second value (`p`) of the `from` Array. ### [](#run-a-range-search-on-id)Run a range search on `id` All indexes implicitly include an ascending document `id` as the index’s last value. If you intend to run range searches on `id`, we recommend you explicitly include an ascending `id` as the last index value in the index definition, even if you have an otherwise identical index. For example, the following `sortByStock()` and `sortByStockandId()` indexes have the same values: ```fsl collection Product { ... index sortByStock { values [.stock] } index sortByStockandId { values [.stock, .id] } ... } ``` Although it’s not explicitly listed, `sortByStock()` implicitly includes an ascending `id` as its last value. To reduce your costs, Fauna only builds the `sortByStock()` index. When a query calls the `sortByStockandId()` index, Fauna uses the `sortByStock()` index behind the scenes. `sortByStockandId()` only acts as a [virtual index](../../../../learn/data-model/indexes/#virtual-indexes) and isn’t materialized. ### [](#pass-terms-and-values)Pass terms and values If an index has both terms and values, you can run an exact match search on documents in a provided range. The following index definition includes `name` as an index term and `stock` as an index value: ```fsl collection Product { ... index byName { terms [.name] values [.stock] } ... } ``` When you call the index, you must provide a term and can specify an optional range: ```fql // Get products named "donkeypinata" // with a stock between 10 (inclusive) and 50 (inclusive) Product.byName("donkey pinata", { from: 10, to: 50 }) ``` ### [](#covered-queries-2)Covered queries If you [project](../../../fql/projection/) or [map](../../set/map/#project) an index’s covered term or value fields, Fauna gets the field values from the index. The following index definition includes several index values: ```fsl collection Product { ... index sortedByPriceLowToHigh { values [.price, .name, .description] } } ``` The following is a covered query: ```fql // This is a covered query. // `name`, `description`, and `price` are values // in the `sortedByPriceLowToHigh()` index definition. 
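// Because the projection below only uses covered values,
// Fauna can serve the results from the index without reading documents.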
Product.sortedByPriceLowToHigh() { name, description, price } ``` If the projection contains an uncovered field, Fauna must retrieve the field values from the documents. This is an uncovered query: ```fql // This is an uncovered query. // `stock` is not one of the terms or values // in the `sortedByPriceLowToHigh()` index definition. Product.sortedByPriceLowToHigh() { name, stock } ``` Performance hint: `non_covered_document_read` Uncovered queries emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example: ``` performance_hint: non_covered_document_read - .stock is not covered by the Product.sortedByPriceLowToHigh index. See https://docs.faunadb.org/performance_hint/non_covered_document_read. at *query*:1:42 | 1 | Product.sortedByPriceLowToHigh() { name, stock } | ^^^^^ | ``` Covered queries are typically faster and less expensive than uncovered queries, which require document reads. If you frequently run uncovered queries, consider adding the uncovered fields to the index definition’s [`values`](../../../../learn/data-model/indexes/#sort-documents). For example: ```fsl collection Product { ... // Adds the `stock` field as an index value. index sortedByPriceLowToHigh { values [.price, .name, .description, .stock] } } ``` #### [](#no-projection-or-mapping)No projection or mapping Index queries without a [projection](../../../fql/projection/) or [mapping](../../set/map/#project) are uncovered. Fauna must read each document returned in the Set. For example: ```fql // This is an uncovered query. // Queries without a projection or mapping // require a document read. Product.byName("limes") ``` Performance hint : `non_covered_document_read` If [performance hints](../../../http/reference/query-summary/#perf) are enabled, index queries without a projection or mapping emit a performance hint. For example: ``` performance_hint: non_covered_document_read - Full documents returned from Product.byName. See https://docs.faunadb.org/performance_hint/non_covered_document_read. at *query*:1:15 | 1 | Product.byName("limes") | ^^^^^^^^^ | ``` If you frequently run such queries, consider adding the uncovered fields to the index definition’s [`values`](../../../../learn/data-model/indexes/#sort-documents). For example: ```fsl collection Product { ... index byName { terms [.name] values [.price, .stock, .description] } ... } ``` Then use projection or mapping to only return the fields you need. Given the previous index definition, the following query is covered: ```fql // This is a covered query. // `price`, `stock`, and `description` are values // in the `byName()` index definition. Product.byName("limes") { price, stock, description } ``` #### [](#filter-covered-values)Filter covered values You can use [`set.where()`](../../set/where/) to filter the results of an [index call](../../../../learn/data-model/indexes/#call). If the [`set.where()`](../../set/where/) predicate only accesses fields defined in the index definition’s `terms` and `values`, the query is [covered](../../../../learn/data-model/indexes/#covered-queries). For example, given the following index definition: ```fsl collection Product { ... index byName { terms [.name] values [.price, .description] } ... } ``` The following query is covered: ```fql // Covered query. // Calls the `byName()` index. // Uses `where()` to filter the results of // the index call. The predicates only // access covered terms and values. 
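// In the `byName()` definition above, `name` is a term and
// `price` and `description` are values.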
Product.byName("limes") .where(.description.includes("Conventional")) .where(.price < 500) { name, description, price } ``` The following query is uncovered: ```fql Product.byName("limes") .where(.description.includes("Conventional")) // The `where()` predicate accesses the uncovered // `stock` field. .where(.stock < 100) .where(.price < 500) { name, description, price } ``` To cover the query, add the uncovered field to the index definition’s `values`: ```fsl collection Product { ... index byName { terms [.name] // Adds `stock` to the index's values values [.price, .description, .stock] } ... } ``` #### [](#dynamic-filtering-using-advanced-query-composition)Dynamic filtering using advanced query composition Complex applications may need to handle arbitrary combinations of search criteria. In these cases, you can use [query composition](../../../../learn/query/composition/) to dynamically apply [indexes](../../../../learn/data-model/indexes/) and [filters](../../../../learn/query/patterns/sets/#filters) to queries. The following template uses query composition to: * Automatically select the most selective index * Apply remaining criteria as filters in priority order * Support both index-based and filter-based search patterns The template uses TypeScript and the [JavaScript driver](../../../../build/drivers/js-client/). A similar approach can be used with any [Fauna client driver](../../../../build/drivers/). ```typescript /** * A JavaScript object with a sorted list of indexes or filters. * * JavaScript maintains key order for objects. * Sort items in the map from most to least selective. */ type QueryMap = Record<string, (...args: any[]) => Query> /** Object to represent a search argument. * * Contains the name of the index to use and the arguments * to pass to it. * * Example: * { name: "by_name", args: ["limes"] } * { name: "range_price", args: [{ from: 100, to: 500 }] } */ type SearchTerm = { name: string args: any[] } /** * Composes a query by prioritizing the most selective index and then * applying filters. * * @param default_query - The initial query to which indexes and filters are applied. * @param index_map - A map of index names to functions that generate query components. * @param filter_map - A map of filter names to functions that generate query components. * @param search_terms - An array of search terms that specify the type and arguments * for composing the query. * @returns The composed query after applying all relevant indices and filters. */ const build_search = ( default_query: Query, index_map: QueryMap, filter_map: QueryMap, search_terms: SearchTerm[] ): Query => { const _search_terms = [...search_terms] // Initialize a default query. Used if no other indexes are applicable. let query: Query = default_query // Iterate through the index map, from most to least selective. build_index_query: for (const index_name of Object.keys( index_map )) { // Iterate through each search term to check if it matches the highest priority index. for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the // list and break out of the loop. if (index_name === search_term.name) { query = index_map[search_term.name](...search_term.args) _search_terms.splice(_search_terms.indexOf(search_term), 1) break build_index_query } } } // Iterate through the filter map, from most to least selective. for (const filter_name of Object.keys(filter_map)) { // Iterate through each search term to check if it matches the highest priority filter.
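// A matching filter is appended to the composed query below using
// `fql` template interpolation, so each filter further narrows the Set.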
for (const search_term of _search_terms) { // If a match is found, update the query. Then remove the search term from the list. if (filter_name === search_term.name) { const filter = filter_map[search_term.name](...search_term.args) query = fql`${query}${filter}` _search_terms.splice(_search_terms.indexOf(search_term), 1) } } } // If there are remaining search terms, you can't build the full query. if (_search_terms.length > 0) { throw new Error("Unable to build query") } return query } ``` The following example implements the template using the [Fauna Dashboard](https://dashboard.fauna.com/)'s demo data: ```typescript // Implementation of `index_map` from the template. // Sort items in the map from most to least selective. const product_index_priority_map: QueryMap = { by_order: (id: string) => fql`Order.byId(${id})!.items.map(.product!)`, by_name: (name: string) => fql`Product.byName(${name})`, by_category: (category: string) => fql`Product.byCategory(Category.byName(${category}).first()!)`, range_price: (range: { from?: number; to?: number }) => fql`Product.sortedByPriceLowToHigh(${range})`, } // Implementation of `filter_map` from the template. // Sort items in the map from most to least selective. const product_filter_map: QueryMap = { by_name: (name: string) => fql`.where(.name == ${name})`, by_category: (category: string) => fql`.where(.category == Category.byName(${category}).first()!)`, range_price: ({ from, to }: { from?: number; to?: number }) => { // Dynamically filter products by price range. if (from && to) { return fql`.where(.price >= ${from} && .price <= ${to})` } else if (from) { return fql`.where(.price >= ${from})` } else if (to) { return fql`.where(.price <= ${to})` } return fql`` }, } // Hybrid implementation of `index_map` and `filter_map` from the template. // Combines filters and indexes to compose FQL query fragments. // Sort items in the map from most to least selective. const product_filter_with_indexes_map: QueryMap = { by_name: (name: string) => fql`.where(doc => Product.byName(${name}).includes(doc))`, by_category: (category: string) => fql`.where(doc => Product.byCategory(Category.byName(${category}).first()!).includes(doc))`, range_price: (range: { from?: number; to?: number }) => fql`.where(doc => Product.sortedByPriceLowToHigh(${range}).includes(doc))`, } const order_id = (await client.query(fql`Order.all().first()!`)) .data.id const query = build_search( fql`Product.all()`, product_index_priority_map, product_filter_with_indexes_map, [ // { type: "by", name: "name", args: ["limes"] }, // { type: "by", name: "category", args: ["produce"] }, { type: "range", name: "price", args: [{ to: 1000 }] }, { type: "by", name: "order", args: [order_id] }, ] ) const res = await client.query(query) ``` #### [](#null-values-are-uncovered)Null values are uncovered Missing or `null` field values are not [stored or covered by an index](../../../../learn/data-model/indexes/#covered-queries), even if the field is listed as one of the `values` in the index’s definition. [Projecting](../../../fql/projection/) or [mapping](../../set/map/#project) a field with a `null` value requires a document read. For example, the following `byName()` index definition includes the `description` field as an index value: ```fsl collection Product { ... 
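// `description` is listed as an index value, but a missing or `null`
// `description` isn't stored in the index.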
index byName { terms [.name] values [.price, .description] } } ``` The following query creates a document that omits the `description` field, which is equivalent to a `null` value for the field: ```fql Product.create({ name: "limes", price: 2_99 // The `description` field is omitted (effectively `null`). }) ``` If you use `byName()` to retrieve the indexed `name`, `price`, and `description` field values, the query is uncovered. A document read is required to retrieve the `null` value of the `description` field. ```fql Product.byName("limes") { name, price, // Projects the `description` field. description } ``` ``` { data: [ { name: "limes", price: 299, // Retrieving the `description` field's `null` value // requires a document read. description: null } ] } ``` # Credential | Learn: Credentials | | --- | --- | --- | A [credential](../../../learn/security/tokens/#credentials) associates a password with an [identity document](../../../learn/security/tokens/#identity-document). You can use the credential and password to create an [authentication token](../../../learn/security/tokens/) for an end user, system, or other identity. ## [](#collection)`Credential` collection Fauna stores credentials as documents in the `Credential` system collection. You can also access this collection using the `Credentials` alias. `Credential` documents have the following FQL structure: ``` { id: "401328088768577609", coll: Credential, ts: Time("2099-06-21T18:39:00.735Z"), document: Customer("401328088729780297"), password: "sekret", data: { desc: "Credential for VIP customer" } } ``` | Field name | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | id | ID | | | ID for the Credential document. The ID is a string-encoded, 64-bit unsigned integer in the 2^53 - 1 range. The ID is unique within the collection.IDs are assigned at document creation. To create a credential with a user-provided id using Credential.create(), you must use a secret with the create_with_id privilege for the Credential collection. If not provided, Fauna generates the id. | | coll | Collection | true | | Collection name: Credential. | | ts | Time | true | | Last time the document was created or updated. | | document | Ref< { *: Any } > | | true | Reference to the credential’s identity document. The identity document can be in any user-defined collection. | | password | String | Null | | | End-user password to associate with the credential’s identity document. Only provided when creating, replacing, or updating a credential.You can use the password to generate a token for the credential using credential.login().Passwords are not returned in Credential documents. | | data | { *: Any } | Null | | | Arbitrary user-defined metadata for the document. | ## [](#static-methods)Static methods You can use the following static methods to manage the `Credential` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Credential.all() | Get a Set of all credentials. | | Credential.byDocument() | Get a credential by its identity document. | | Credential.byId() | Get a credential by its document id. | | Credential.create() | Create a credential. | | Credential.firstWhere() | Get the first credential that matches a provided predicate. | | Credential.toString() | Get "Credential" as a String. | | Credential.where() | Get a Set of credentials that match a provided predicate. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage specific `Credential` documents in FQL.
| Method | Description | | --- | --- | --- | --- | | credential.delete() | Delete a credential. | | credential.exists() | Test if a credential exists. | | credential.login() | Create a token for a provided credential and its password. | | credential.replace() | Replace a credential. | | credential.update() | Update a credential. | | credential.verify() | Test whether a provided password is valid for a credential. | # `Credential.all()` | Learn: Credentials | | --- | --- | --- | Get a Set of all [credentials](../../../../learn/security/tokens/#credentials). ## [](#signature)Signature ```fql-sig Credential.all() => Set Credential.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [credentials](../../../../learn/security/tokens/#credentials), represented as [`Credential` documents](../), for the database. To limit the returned Set, you can provide an optional range. If this method is the last expression in a query, the first page of the Set is returned. See [Pagination](../../../../learn/query/pagination/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Credential documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded Credential.all() call. See Range examples.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all credentials are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be an Credential document. | | to | Any | | End of the range (inclusive). Must be an Credential document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Credential documents in the provided range. If a range is omitted, all credentials are returned.The Set is empty if:The database has no credentials.There are no credentials in the provided range.The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples 1. Get all credentials for the database: ```fql Credential.all() ``` ``` { data: [ { id: "401670531158376525", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("111") }, { id: "401670531164667981", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("222") }, { id: "401670531170959437", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("333") } ] } ``` 2. Given the previous Set, get all credentials starting with the credential for `Customer("222")` (inclusive): ```fql Credential.all({ from: Credential.byDocument(Customer.byId("222")) }) ``` ``` { data: [ { id: "401670531164667981", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("222") }, { id: "401670531170959437", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("333") } ] } ``` 3. 
Get a Set of credentials from the credential for `Customer("111")` (inclusive) to the credential for `Customer("222")` (inclusive): ```fql Credential.all({ from: Credential.byDocument(Customer.byId("111")), to: Credential.byDocument(Customer.byId("222")) }) ``` ``` { data: [ { id: "401670531158376525", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("111") }, { id: "401670531164667981", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("222") } ] } ``` 4. Get a Set of credentials up to the credential for `Customer("222")` (inclusive): ```fql Credential.all({ to: Credential.byDocument(Customer.byId("222")) }) ``` ``` { data: [ { id: "401670531158376525", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("111") }, { id: "401670531164667981", coll: Credential, ts: Time("2099-06-25T13:21:59.270Z"), document: Customer("222") } ] } ``` # `Credential.byDocument()` | Learn: Credentials | | --- | --- | --- | Get a [credential](../../../../learn/security/tokens/#credentials) by its [identity document](../../../../learn/security/tokens/#identity-document). ## [](#signature)Signature ```fql-sig Credential.byDocument(document: { *: Any } | Null) => Ref ``` ## [](#description)Description Gets a [credential](../../../../learn/security/tokens/#credentials), represented as a [`Credential` document](../), by its [identity document](../../../../learn/security/tokens/#identity-document). A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | document | Object | true | Identity document for the credential to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Ref | Resolved reference to a Credential document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql let document = Customer.byId("111") Credential.byDocument(document) ``` ``` { id: "401328088781160521", coll: Credential, ts: Time("2099-06-21T18:39:00.735Z"), document: Customer("111") } ``` # `Credential.byId()` | Learn: Credentials | | --- | --- | --- | Get a [credential](../../../../learn/security/tokens/#credentials) by its [document `id`](../../../../learn/data-model/documents/#meta). ## [](#signature)Signature ```fql-sig Credential.byId(id: ID) => Ref ``` ## [](#description)Description Gets a [credential](../../../../learn/security/tokens/#credentials), represented as an [`Credential` document](../), by its [document `id`](../../../../learn/data-model/documents/#meta). A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | id | String | true | ID of the Credential document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Ref | Resolved reference to the Credential document. 
Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Credential.byId("401328088781160521") ``` ``` { id: "401328088781160521", coll: Credential, ts: Time("2099-06-21T18:39:00.735Z"), document: Customer("111") } ``` # `Credential.create()` | Learn: Credentials | | --- | --- | --- | Create a [credential](../../../../learn/security/tokens/#credentials). ## [](#signature)Signature ```fql-sig Credential.create(data: { id: ID | Null, document: Ref<{ *: Any }>, password: String | Null, data: { *: Any } | Null }) => Credential ``` ## [](#description)Description Creates a [credential](../../../../learn/security/tokens/#credentials) with the provided document fields. Fauna stores credentials as documents in the [`Credential` system collection](../). A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the new Credential document.For supported document fields, see Credential collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Credential | The new Credential document. | ## [](#examples)Examples ```fql Credential.create({ document: Customer.byId("111"), password: "sekret" }) ``` ``` { id: "401670627820306505", coll: Credential, ts: Time("2099-06-25T13:23:31.440Z"), document: Customer("111") } ``` # `Credential.firstWhere()` | Learn: Credentials | | --- | --- | --- | Get the first [credential](../../../../learn/security/tokens/#credentials) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Credential.firstWhere(pred: (Credential => Boolean)) => Credential | Null ``` ## [](#description)Description Gets the first [credential](../../../../learn/security/tokens/#credentials), represented as a [`Credential` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Credential document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first Credential document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Credential | First Credential document that matches the predicate. | | Null | No Credential document matches the predicate. | ## [](#examples)Examples ```fql Credential.firstWhere(.document == Customer.byId("111")) ``` ``` { id: "371153420791316514", coll: Credential, ts: Time("2099-07-24T17:05:34.890Z"), document: Customer("111") } ``` # `Credential.toString()` | Learn: Credentials | | --- | --- | --- | Get `"Credential"` as a [String](../../../fql/types/#string).
## [](#signature)Signature ```fql-sig Credential.toString() => String ``` ## [](#description)Description Returns the name of the [`Credential` collection](../) as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Credential" | ## [](#examples)Examples ```fql Credential.toString() ``` ``` "Credential" ``` # `Credential.where()` | Learn: Credentials | | --- | --- | --- | Get a Set of [credentials](../../../../learn/security/tokens/#credentials) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Credential.where(pred: (Credential => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [credentials](../../../../learn/security/tokens/#credentials), represented as [`Credential` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). If `Credential.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts an Credential document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of Credential documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Credential documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql Credential.where(.document == Customer.byId("111")) ``` ``` { data: [ { id: "412654807560487424", coll: Credential, ts: Time("2099-08-11T04:25:08.950Z"), document: Customer("111") } ] } ``` # `credential.delete()` | Learn: Credentials | | --- | --- | --- | Delete a credential. ## [](#signature)Signature ```fql-sig delete() => NullCredential ``` ## [](#description)Description Deletes a [credential](../../../../learn/security/tokens/#credentials), represented as a [`Credential` document](../). A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullCredential | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ```fql Credential.byId("401670627820306505")!.delete() ``` ``` Credential("401670627820306505") /* deleted */ ``` # `credential.exists()` | Learn: Credentials | | --- | --- | --- | Test if a [credential](../../../../learn/security/tokens/#credentials) exists. 
## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if a [credential](../../../../learn/security/tokens/#credentials), represented as an [`Credential` document](../), exists. A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Credential.byId("12345").exists() // true Credential.byId("12345") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Credential document exists. If false, the Credential document doesn’t exist. | ## [](#examples)Examples ```fql Credential.byId("12345").exists() ``` ``` true ``` # `credential.login()` | Learn: Credentials | | --- | --- | --- | Create a [token](../../../../learn/security/tokens/) for a provided [credential](../../../../learn/security/tokens/#credentials) and its password. ## [](#signature)Signature ```fql-sig login(secret: String) => Token login(secret: String, ttl: Time) => Token ``` ## [](#description)Description The `login()` method authenticates an identity in Fauna by providing the password for a `Credential` document. Attempts to login with an incorrect password result in an error. Call [`credential.update()`](../update/) to set a new password. This method creates a token when it authenticates an identity. ### [](#required-privileges)Required privileges To call `login()` in a query, your access token must have a role with the `create` privilege for the `Token` system collection. For example: ```fsl role manager { ... privileges Token { create } } ``` The built-in `admin` and `server` roles have this privilege. User-defined functions (UDFs) can be assigned an optional role. If assigned, this role supersedes the access token’s privileges. If a UDF includes `login()`, the UDF’s role must have the `create` privilege for the `Token` collection. ### [](#multiple-tokens)Multiple tokens If you call this method multiple times, it creates multiple tokens. This is because an identity may have multiple tokens that can access multiple devices simultaneously. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | secret | String | true | Credential document password. | | ttl | Time | | Timestamp indicating a document lifespan. When the ttl is reached, Fauna removes it. If ttl isn’t set, its default value is null, which causes the document to persist indefinitely or until deleted. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Token | Token, which includes the token’s secret. 
| ## [](#examples)Examples The following simplified sequence creates a user and associates credentials with the user, which can then be used to log in. 1. Create a user in the example `Customer` collection: ```fql Customer.create({ name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } }) ``` ``` { id: "999", coll: Customer, ts: Time("2099-07-31T13:16:15.040Z"), cart: null, orders: "hdW...", name: "John Doe", email: "john.doe@example.com", address: { street: "123 Main St", city: "San Francisco", state: "CA", postalCode: "12345", country: "United States" } } ``` 2. Create a user credential, including the password: ```fql Credential.create({ document: Customer.byId("999"), password: "sekret" }) ``` ``` { id: "412654692679549440", coll: Credential, ts: Time("2099-06-25T13:27:23.170Z"), document: Customer("999") } ``` 3. Log in with a password: ```fql let document = Customer.byId("999") Credential.byDocument(document)?.login("sekret") ``` ``` { id: "412654692933304832", coll: Token, ts: Time("2099-07-31T13:17:46.900Z"), document: Customer("999"), secret: "fn..." } ``` # `credential.replace()` | Learn: Credentials | | --- | --- | --- | Replace a [credential](../../../../learn/security/tokens/#credentials). ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => Credential ``` ## [](#description)Description Replaces all fields in a [credential](../../../../learn/security/tokens/#credentials), represented as a [`Credential` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `id`, `coll`, and `ts` metadata fields, are removed. A credential associates a password with an [identity document](../../../../learn/security/tokens/#identity-document). You can use credentials and the [`credential.login()`](../login/) method to create [tokens](../../../../learn/security/tokens/) as part of an [end-user authentication system](../../../../build/tutorials/auth/). ### [](#metadata-fields)Metadata fields You can’t use this method to replace the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Fields for the Credential document. Fields not present in the object, excluding the id, coll, and ts metadata fields, are removed.For supported document fields, see Credential collection.The object can’t include the following metadata fields:* id * coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Credential | Credential document with replaced fields. | ## [](#examples)Examples ```fql Credential.byId("412654807560487424")!.replace({ document: Customer.byId("111"), password: "sekret" }) ``` ``` { id: "412654807560487424", coll: Credential, ts: Time("2099-07-28T03:42:54.650Z"), document: Customer("111") } ``` # `credential.update()` | Learn: Credentials | | --- | --- | --- | Update a [credential](../../../../learn/security/tokens/#credentials). ## [](#signature)Signature ```fql-sig update(data: { *: Any }) => Credential ``` ## [](#description)Description Updates a [credential](../../../../learn/security/tokens/#credentials), represented as a [`Credential` document](../), with fields from a provided data object.
During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the Credential document.For supported document fields, see Credential collection.The object can’t include the following metadata fields:* id * coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Document | The updated Credential document. | ## [](#examples)Examples ```fql Credential.byId("412654807560487424")!.update({ password: "newest-sekret" }) ``` ``` { id: "412654807560487424", coll: Credential, ts: Time("2099-08-14T23:50:22.420Z"), document: Customer("111") } ``` # `credential.verify()` | Learn: Credentials | | --- | --- | --- | Test whether a provided password is valid for a [credential](../../../../learn/security/tokens/#credentials). ## [](#signature)Signature ```fql-sig verify(secret: String) => Boolean ``` ## [](#description)Description Verify a password against a `Credential` document. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | secret | String | true | Password to compare against the credential document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | The secret status. A status of true means the password is valid, while false means it is invalid. | ## [](#examples)Examples ```fql Credential.byId("412654807560487424")!.verify("fauna-demo") ``` ``` true ``` # Database | Learn: Databases and multi-tenancy | | --- | --- | --- | In Fauna, a database stores data as [documents](../../../learn/data-model/documents/) in one or more [collections](../../../learn/data-model/collections/). Fauna databases support a hierarchical database structure with top-level and child databases. ## [](#collection)`Database` collection Fauna stores metadata and settings for a database’s child databases as documents in the `Database` system collection. These documents have the [DatabaseDef](../../fql/types/#databasedef) type. `Database` documents have the following FQL structure: ``` { name: "child_db", coll: Database, ts: Time("2099-06-24T21:53:40.670Z"), typechecked: true, protected: false, priority: 10, global_id: "ysjpygonryyr1", data: { desc: "Prod child database" } } ``` | Field | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | name | String | | true | Name of the database. | | coll | String | true | | Collection name: Database. | | ts | Time | true | | Last time the document was created or updated. | | priority | Number | Null | | | User-defined priority assigned to the database. | | typechecked | Boolean | Null | | | If true, typechecking is enabled for the database. If false, typechecking is disabled.Inherits the parent database’s typechecking setting. | | protected | Boolean | Null | | | If true, protected mode is enabled for the database. 
If false, protected mode is disabled.Inherits the parent database’s protected mode setting. | | global_id | String | true | | Auto-generated, globally unique ID for the database. See Global database ID. | | data | { *: Any } | Null | | | Arbitrary user-defined metadata for the document. | ## [](#scope)Scope The `Database` collection only contains documents for the direct child databases of the database scoped to your authentication secret. You can’t use the `Database` collection to access parent, peer, or other descendant databases. Using FQL to create or manage top-level databases is not supported. ## [](#static-methods)Static methods You can use the following static methods to manage the `Database` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Database.all() | Get a Set of all child databases. | | Database.byName() | Get a child database by its name. | | Database.create() | Create a child database. | | Database.firstWhere() | Get the first child database document that matches a provided predicate. | | Database.toString() | Get "Database" as a String. | | Database.where() | Get a Set of child databases that match a provided predicate. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage specific `Database` documents in FQL. | Method | Description | | --- | --- | --- | --- | | database.delete() | Deletes a child database. | | database.exists() | Test if a child database exists. | | database.replace() | Replace a child database's metadata and settings. | | database.update() | Update a child database's metadata and settings. | # `Database.all()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Get a Set of all [child databases](../../../../learn/data-model/databases/#child) nested directly under the database. ## [](#signature)Signature ```fql-sig Database.all() => Set Database.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [child databases](../../../../learn/data-model/databases/#child), represented as [`Database` documents](../), nested directly under the database to which the query’s [authentication secret](../../../../learn/security/authentication/#secrets) is scoped. To limit the returned Set, you can provide an optional range. If this method is the last expression in a query, the first page of the Set is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#scope)Scope The `Database` collection only contains documents for the direct child databases of the database scoped to your authentication secret. You can’t use the `Database` collection to access parent, peer, or other descendant databases. Using FQL to create or manage top-level databases is not supported. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Database documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded Database.all() call. See Range examples.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all child databases are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be a Database document.
| | to | Any | | End of the range (inclusive). Must be a Database document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Database documents in the provided range. If a range is omitted, all child databases are returned. The Set is empty if: The database has no child databases. There are no child databases in the provided range. The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples ```fql Database.all() ``` ``` { data: [ { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), typechecked: true, priority: 10, global_id: "ysjpykbahyyr1" }, ... ] } ``` ```fql Database.all({ from: Database.byName("childDB") }) ``` ``` { data: [ { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), typechecked: true, priority: 10, global_id: "ysjpykbahyyr1" }, ... ] } ``` # `Database.byName()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Get a [child database](../../../../learn/data-model/databases/#child) by its name. ## [](#signature)Signature ```fql-sig Database.byName(name: String) => NamedRef ``` ## [](#description)Description Gets a [child database](../../../../learn/data-model/databases/#child), represented as a [`Database` document](../), by its name. Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. ### [](#scope)Scope The `Database` collection only contains documents for the direct child databases of the database scoped to your authentication secret. You can’t use the `Database` collection to access parent, peer, or other descendant databases. Using FQL to create or manage top-level databases is not supported. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | name | String | true | Name of the Database document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NamedRef | Resolved reference to the Database document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Database.byName("childDB") ``` ``` { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), global_id: "ysjpykbahyyr1", priority: 10, typechecked: true } ``` # `Database.create()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Create a [child database](../../../../learn/data-model/databases/#child). ## [](#signature)Signature ```fql-sig Database.create(data: { name: String, priority: Number | Null, typechecked: Boolean | Null, protected: Boolean | Null, data: { *: Any } | Null }) => DatabaseDef ``` ## [](#description)Description Creates a [child database](../../../../learn/data-model/databases/#child) with the provided metadata and settings. The parent database is the database to which the query’s [authentication secret](../../../../learn/security/authentication/#secrets) is scoped. Using `Database.create()` to create a top-level database is not supported. Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type.
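For example, here’s a minimal sketch that creates a child database with typechecking enabled and arbitrary user-defined metadata in the `data` field. The database name and metadata values below are placeholders, not part of this reference:

```fql
// A sketch: creates a child database with optional settings and
// user-defined metadata. The name and metadata are placeholder values.
Database.create({
  name: "staging_db",
  typechecked: true,
  priority: 5,
  data: { desc: "Staging environment" }
})
```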
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields containing metadata and settings for the new Database document.For supported document fields, see Database collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | DatabaseDef | The new Database document. | ## [](#examples)Examples ```fql Database.create({ name: "childDB", typechecked: true, priority: 10 }) ``` ``` { name: "childDB", coll: Database, ts: Time("2099-06-24T21:53:40.670Z"), typechecked: true, priority: 10, global_id: "ysjpygonryyr1" } ``` # `Database.firstWhere()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Get the first [child database](../../../../learn/data-model/databases/#child) document that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Database.firstWhere(pred: (DatabaseDef => Boolean)) => DatabaseDef | Null ``` ## [](#description)Description Gets the first [child database](../../../../learn/data-model/databases/#child), represented as an [`Database` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. ### [](#scope)Scope The `Database` collection only contains documents for the direct child databases of the database scoped to your authentication secret. You can’t use the `Database` collection to access parent, peer, or other descendant databases. Using FQL to create or manage top-level databases is not supported. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Database document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first Database document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | DatabaseDef | First Database document that matches the predicate. | | Null | No Database document matches the predicate. | ## [](#examples)Examples ```fql Database.firstWhere(.name.includes("child")) ``` ``` { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), typechecked: true, global_id: "ysjpykbahyyr1", priority: 10 } ``` ```fql Database.firstWhere(childDB => childDB.priority > 5) ``` ``` { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), global_id: "ysjpykbahyyr1", priority: 10, typechecked: true } ``` # `Database.toString()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Get `"Database"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Database.toString() => String ``` ## [](#description)Description Returns the name of the [`Database` collection](../) as a [String](../../../fql/types/#string). 
## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Database" | ## [](#examples)Examples ```fql Database.toString() ``` ``` "Database" ``` # `Database.where()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Get a Set of [child databases](../../../../learn/data-model/databases/#child) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Database.where(pred: (DatabaseDef => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [child databases](../../../../learn/data-model/databases/#child), represented as [`Database` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. If `Database.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#scope)Scope The `Database` collection only contains documents for the direct child databases of the database scoped to your authentication secret. You can’t use the `Database` collection to access parent, peer, or other descendant databases. Using FQL to create or manage top-level databases is not supported. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that: Accepts a Database document as its only argument. Supports shorthand syntax. Returns a Boolean value. The method returns a Set of Database documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Database documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql Database.where(.name.includes("child")) ``` ``` { data: [ { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), global_id: "ysjpykbahyyr1", priority: 10, typechecked: true }, ... ] } ``` ```fql Database.where(childDB => childDB.priority == 10) ``` ``` { data: [ { name: "childDB", coll: Database, ts: Time("2099-06-24T21:54:38.890Z"), typechecked: true, global_id: "ysjpykbahyyr1", priority: 10 }, ... ] } ``` # `database.delete()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Delete a [child database](../../../../learn/data-model/databases/#child). ## [](#signature)Signature ```fql-sig delete() => NullDatabaseDef ``` ## [](#description)Description Deletes a [child database](../../../../learn/data-model/databases/#child), represented as a [`Database` document](../). The parent database is the database to which the query’s [authentication secret](../../../../learn/security/authentication/#secrets) is scoped. Using `database.delete()` to delete a top-level database is not supported. Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. ### [](#considerations)Considerations When you delete a database, its data becomes inaccessible and is asynchronously deleted. As part of the deletion process, Fauna recursively deletes: * Any keys scoped to the database. * The database’s child databases, including any nested databases.
Deleting a database with a large number of keys can exceed Transactional Write Ops throughput limits. This can cause [throttling errors](../../../http/reference/errors/#rate-limits) with a `limit_exceeded` [error code](../../../http/reference/errors/#error-codes) and a 429 HTTP status code. Deleting a database with a large number of child databases can cause timeout errors with a `time_out` [error code](../../../http/reference/errors/#error-codes) and a 440 HTTP status code. To avoid throttling or timeouts, incrementally delete all keys and child databases before deleting the database. See [delete all keys](../../key/delete/#delete-all-keys) and [delete all child databases](#delete-all-dbs). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullDatabaseDef | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ### [](#basic-ex)Basic example ```fql Database.byName("childDB")!.delete() ``` ``` Database.byName("childDB") /* deleted */ ``` ### [](#delete-all-dbs)Delete all child databases To [avoid timeouts](../../../../learn/data-model/databases/#delete-cons), you can incrementally delete all child databases for a database before deleting the database itself. To stay within [transaction size limits](../../../requirements-limits/#glimits), use [`set.paginate()`](../../set/paginate/) to perform the deletions over several queries instead of one. ```fql // Gets all `Database` system collection documents. // Uses `pageSize()` to limit the page size. // Uses `paginate()` to project the after cursor. let page = Database.all().pageSize(200).paginate() // `paginate()` returns an object. The object's `data` property // contains an Array of `Database` documents. let data = page.data // Use `forEach()` to delete each `Database` document in the // `data` Array. data.forEach(doc => doc.delete()) // Project the `after` cursor returned by `paginate()`. // Use the cursor to iterate through the remaining pages. page { after } ``` ``` { after: "hdWDxoq..." } ``` Subsequent queries use the cursor and [`Set.paginate()`](../../set/static-paginate/) to iterate through the remaining pages: ```fql // Uses `Set.paginate()` to iterate through pages. let page = Set.paginate("hdWDxoq...") let data = page.data data.forEach(doc => doc.delete()) page { after } ``` # `database.exists()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Test if a [child database](../../../../learn/data-model/databases/#child) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if a [child database](../../../../learn/data-model/databases/#child), represented as a [`Database` document](../), exists. The parent database is the database to which the query’s [authentication secret](../../../../learn/security/authentication/#secrets) is scoped. Using `database.exists()` to check the existence of a top-level database is not supported. Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Database.byName("childDB").exists() // true Database.byName("childDB") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value.
* Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Database document exists. If false, the Database document doesn’t exist. | ## [](#examples)Examples ```fql Database.byName("childDB").exists() ``` ``` true ``` ```fql Database.byName("noChildDB").exists() ``` ``` false ``` # `database.replace()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Replace a [child database](../../../../learn/data-model/databases/#child)'s metadata and settings. ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => DatabaseDef ``` ## [](#description)Description Replaces all fields in a [child database](../../../../learn/data-model/databases/#child)'s metadata and settings, represented as an [`Database` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `coll` and `ts` metadata fields, are removed. Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Fields for the Database document. Fields not present, excluding the coll and ts metadata fields, in the object are removed.For supported document fields, see Database collection.The object can’t include the following metadata fields:collts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | DatabaseDef | Database document with replaced fields. | ## [](#examples)Examples ```fql Database.byName("childDB")!.replace({name: "childDB2"}) ``` ``` { name: "childDB2", coll: Database, ts: Time("2099-06-24T21:54:13.225Z"), global_id: "ysjpyeykhyyr4" } ``` # `database.update()` | Learn: Databases and multi-tenancy | | --- | --- | --- | Update a [child database](../../../../learn/data-model/databases/#child)'s metadata and settings. ## [](#signature)Signature ```fql-sig update(data: { *: Any }) => DatabaseDef ``` ## [](#description)Description Updates a [child database](../../../../learn/data-model/databases/#child)'s metadata and settings, represented as a [`Database` document](../), with fields from a provided data object. Fauna stores child databases as documents in the parent database’s [`Database` system collection](../). `Database` documents have the [DatabaseDef](../../../fql/types/#databasedef) type. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. 
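For example, here’s a minimal sketch that clears a child database’s user-defined `priority` setting; it assumes a child database named `childDB` exists:

```fql
// A sketch: removes the `priority` field from a child database's
// settings by setting it to `null`. Assumes `childDB` exists.
Database.byName("childDB")!.update({ priority: null })
```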
### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the Database document.For supported document fields, see Database collection.The object can’t include the following metadata fields:* coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | DatabaseDef | The updated Database document. | ## [](#examples)Examples ```fql Database.byName("childDB")!.update({typechecked: false}) ``` ``` { name: "childDB", coll: Database, ts: Time("2099-06-24T21:58:19.346Z"), priority: 10, global_id: "ysjpykbahyyr1", typechecked: false } ``` # Date [Date](../../fql/types/#date) methods and properties. ## [](#description)Description [Date](../../fql/types/#date) functions are provided to represent dates without a time. ## [](#instance-properties)Instance properties | Method | Description | | --- | --- | --- | --- | | dayOfMonth | Get the day of the month from a Date. | | dayOfWeek | Get the day of the week from a Date. | | dayOfYear | Get the day of the year from a Date. | | month | Get the month of a Date. | | year | Get the year of a Date. | ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Date() | Construct a Date from a ISO 8601 date String. | | Date.fromString() | Construct a Date from a date String. | | Date.today() | Get the current UTC Date. | ## [](#instance-methods)Instance methods | Method | Description | | --- | --- | --- | --- | | date.add() | Add number of days to a Date. | | date.difference() | Get the difference between two Dates. | | date.subtract() | Subtract number of days from a Date. | | date.toString() | Convert a Date to a String. | # `dayOfMonth` Get the day of the month from a [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig dayOfMonth: Number ``` ## [](#description)Description Extracts the day-of-month from the instance date. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Day of month. | ## [](#examples)Examples ```fql Date('2099-02-10').dayOfMonth ``` ``` 10 ``` # `dayOfWeek` Get the day of the week from a [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig dayOfWeek: Number ``` ## [](#description)Description Extracts the day-of-week from the instance date. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Day of week.1 = Monday7 = Sunday | ## [](#examples)Examples ```fql Date('2099-02-10').dayOfWeek ``` ``` 2 ``` # `dayOfYear` Get the day of the year from a [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig dayOfYear: Number ``` ## [](#description)Description Extracts the day-of-year from the instance date. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Day of year. | ## [](#examples)Examples ```fql Date('2099-02-10').dayOfYear ``` ``` 41 ``` # `month` Get the month of a [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig month: Number ``` ## [](#description)Description Extract the month from the instance date. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Month part of Date. 
| ## [](#examples)Examples ```fql Date('2099-02-10').month ``` ``` 2 ``` # `year` Get the year of a [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig year: Number ``` ## [](#description)Description Extract the year from the instance date. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Year field of the calling date object. | ## [](#examples)Examples ```fql Date('2099-02-10').year ``` ``` 2099 ``` # `Date()` Construct a [Date](../../../fql/types/#date) from an ISO 8601 date [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Date(date: String) => Date ``` ## [](#description)Description The `Date()` method converts a `YYYY-MM-DD` [String](../../../fql/types/#string) to a [Date](../../../fql/types/#date). The method accepts only a date [String](../../../fql/types/#string) and returns an error if time information is included in the [String](../../../fql/types/#string). [Date](../../../fql/types/#date) objects render to date strings in query responses. This method is equivalent to [`Date.fromString()`](../fromstring/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | date | String | true | Date String in the YYYY-MM-DD format. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Date | Date representation of the string. | ## [](#examples)Examples Convert a date string to a [Date](../../../fql/types/#date): ```fql Date("2099-10-20") ``` ``` Date("2099-10-20") ``` # `Date.fromString()` Construct a [Date](../../../fql/types/#date) from a date [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Date.fromString(date: String) => Date ``` ## [](#description)Description The `Date.fromString()` method converts a `YYYY-MM-DD` date [String](../../../fql/types/#string) to a [Date](../../../fql/types/#date). The method accepts only a date [String](../../../fql/types/#string) and returns an error if time information is included in the [String](../../../fql/types/#string). [Date](../../../fql/types/#date) objects render to date strings in query responses. This method is equivalent to `Date()`. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | date | String | true | Date String in the YYYY-MM-DD format. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Date | Date converted from string representation of the date. | ## [](#examples)Examples Convert a date string to a [Date](../../../fql/types/#date): ```fql Date.fromString("2099-10-20") ``` ``` Date("2099-10-20") ``` # `Date.today()` Get the current UTC [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig Date.today() => Date ``` ## [](#description)Description The `Date.today()` method gets the current [Date](../../../fql/types/#date). The returned [Date](../../../fql/types/#date) is the current [UTC](https://www.itu.int/dms_pubrec/itu-r/rec/tf/R-REC-TF.460-6-200202-I!!PDF-E.pdf) date. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Date | Object representing the current year, month, and day, in UTC. | ## [](#examples)Examples Get the current date: ```fql Date.today() ``` ``` Date("2099-06-24") ``` # `date.add()` Add a number of days to a [Date](../../../fql/types/#date).
## [](#signature)Signature ```fql-sig add(amount: Number, unit: String) => Date ``` ## [](#description)Description Adds a number of days to a [Date](../../../fql/types/#date), returning the resulting [Date](../../../fql/types/#date). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | amount | Number | true | Number of units to add to the given date. | | unit | String | true | Unit for the operation. Must be days (case-sensitive). | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Date | Date with the added days. | ## [](#examples)Examples ```fql Date('2099-02-10').add(19, 'days') ``` ``` Date("2099-03-01") ``` # `date.difference()` Get the difference between two [Date](../../../fql/types/#date)s. ## [](#signature)Signature ```fql-sig difference(start: Date) => Number ``` ## [](#description)Description Subtracts a date from the instance date to get the difference in number of days. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | start | Date | true | Date to subtract from the instance date. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Difference between the instance date and the provided start date, in days. | ## [](#examples)Examples ```fql Date('2099-02-10').difference(Date('2099-01-01')) ``` ``` 40 ``` # `date.subtract()` Subtract a number of days from a [Date](../../../fql/types/#date). ## [](#signature)Signature ```fql-sig subtract(amount: Number, unit: String) => Date ``` ## [](#description)Description Subtracts a provided number of days from a [Date](../../../fql/types/#date). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | amount | Number | true | Number of units to subtract from the given date. | | unit | String | true | Unit for the operation. Must be days (case-sensitive). | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Date | Resulting date. | ## [](#examples)Examples ```fql Date('2099-02-10').subtract(41, 'days') ``` ``` Date("2098-12-31") ``` # `date.toString()` Convert a [Date](../../../fql/types/#date) to a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description Converts the calling [Date](../../../fql/types/#date) to an ISO 8601 [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | String representation of the calling Date. | ## [](#examples)Examples ```fql let d = Date("2099-10-20") d.toString() ``` ``` "2099-10-20" ``` # Document | Learn: Documents | | --- | --- | --- | You add data to Fauna as JSON-like [documents](../../../learn/data-model/documents/), stored in [collections](../../../learn/data-model/collections/). ## [](#doc-fields)Document fields All documents contain the `id`, `coll`, `ts`, and optional `ttl` [metadata fields](../../../learn/data-model/documents/#meta). Documents in user-defined collections also typically contain user-defined fields.
For example: ``` { id: "392886847463751746", coll: Product, ts: Time("2099-04-10T16:50:12.850Z"), ttl: Time("2099-04-15T16:50:12.850Z"), name: "key limes", description: "Conventional, 16 oz bag", price: 299, stock: 100, category: Category("401610017107607625") } ``` | Field | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | id | ID | true | | ID for the document. The ID is a string-encoded, 64-bit unsigned integer in the 2^53 - 1 range. The ID is unique within the collection. IDs are assigned at document creation. To create a document with a user-provided id using collection.create(), you must use a secret with the create_with_id privilege. If not provided, Fauna generates the id. | | coll | Collection | true | | Name of the document’s collection. The coll field can’t be indexed. | | ts | Time | true | | Last time the document was created or updated. | | ttl | Time or null | | | Time-to-live (TTL) for the document. Only present if set. If not present or set to null, the document persists indefinitely. | | | Any supported data type | | | User-defined document field. Schema method names and schema metadata field names are reserved and can’t be used as a field name but can be used in nested objects. You can enforce typing and constraints for user-defined fields in a collection using collection schema. | | data | Object | | true | A reserved field that contains all user-defined fields and their values. By default, the data field isn’t returned in query results. However, if typechecking is disabled, you can project the field to return it. The data field does not contain computed fields or metadata fields, such as id, coll, ts, or ttl. You can use the data field to safely nest user-defined fields that have reserved field names, such as id or ttl, in a document. See Data field and Avoid conflicts with reserved fields in the v10 migration docs. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage a document in FQL. | Method | Description | | --- | --- | --- | --- | | document.delete() | Delete a collection document. | | document.exists() | Test if a collection document exists. | | document.replace() | Replace all fields in a collection document. | | document.replaceData() | Replace a collection document using an object that may contain metadata fields. | | document.update() | Update a collection document's fields. | | document.updateData() | Update a collection document using an object that may contain metadata fields. | # `document.delete()` | Learn: Documents | | --- | --- | --- | Delete a [collection document](../../../../learn/data-model/documents/). ## [](#signature)Signature ```fql-sig delete() => ``` ## [](#description)Description Deletes a [collection document](../../../../learn/data-model/documents/). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | Document doesn’t exist. The data type is taken from the collection’s name with the Null prefix. For example, a NullDoc for the Product collection has the NullProduct type. See NullDoc. | ## [](#examples)Examples The document exists: ```fql // Uses the `Product` collection's `byName()` index and // the `first()` method to get a single document. let product = Product.byName("cups").first() product!.delete() ``` ``` Product("111") /* deleted */ ``` The document doesn’t exist: ```fql Product.byId("12345")!.delete() ``` ``` document_not_found: Collection `Product` does not contain document with id 12345.
error: Collection `Product` does not contain document with id 12345. at *query*:1:13 | 1 | Product.byId("12345")!.delete() | ^^^^^^^^^^ | ``` # `document.exists()` | Learn: Documents | | --- | --- | --- | Test if a [collection document](../../../../learn/data-model/documents/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description The `exists()` method tests if a collection document exists. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Product.byId("111").exists() // true Product.byId("111") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | true = The document exists.false = The document doesn’t exist. | ## [](#examples)Examples ```fql Product.byId("222").exists() ``` ``` true ``` # `document.replace()` | Learn: Documents | | --- | --- | --- | Replace all fields in a [collection document](../../../../learn/data-model/documents/). ## [](#signature)Signature ```fql-sig replace(data: { ttl: Time | Null, data: Null, *: Any }) => ``` ## [](#description)Description Replaces all fields in an collection document with fields from a provided data object. Fields not present in the data object, excluding the `id`, `coll`, `ts`, and `data` metadata fields, are removed. ### [](#metadata-fields)Metadata fields You can’t use this method to replace the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` * `data` ### [](#reserved-fields)Reserved fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` * `data` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Fields for the collection document. Fields not present in the object, excluding the id, coll, ts, and data metadata fields, are removed.The object can’t include the following metadata fields:idcolltsdata | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | Collection document with replaced fields.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. 
| ## [](#examples)Examples ### [](#basic)Basic Given the following document: ``` { id: "777", coll: Product, ts: Time("2099-04-10T16:50:12.850Z"), name: "limes", description: "Conventional, 16 oz bag", price: 2_99, stock: 30, category: Category("789") } ``` Call `replace()` with a replacement document object: ```fql Product.byId("777")?.replace({ name: "limes", description: "2 ct", price: 99, stock: 50, category: Category.byId("789") }) ``` ``` { id: "777", coll: Product, ts: Time("2099-04-10T17:54:37.670Z"), name: "limes", description: "2 ct", price: 99, stock: 50, category: Category("789") } ``` ### [](#default)Default values A [field definition](../../../../learn/schema/#field-definitions) can set a [default field value](../../../fsl/field-definitions/#default) for documents in a collection: ```fsl collection Customer { // `name` accepts `String` and `Null` values. // If missing, defaults to `unknown`. name: String? = "unknown" email: String } ``` If you don’t provide a value during document replacement, the document uses the default value: ```fql // Replaces a `Customer` document. Customer.byId("111")?.replace({ // The `name` field is missing. email: "john.doe@example.com" }) ``` ``` { id: "111", coll: Customer, ts: Time("2099-02-19T14:53:53.940Z"), cart: Order("413002506150347264"), orders: "hdW...", email: "john.doe@example.com", // `name` defaulted to `unknown`. name: "unknown" } ``` If you provide an explicit `null` value, the field is `null`. Fields with `null` values aren’t stored or returned. ```fql Customer.byId("111")?.replace({ // `name` is an explicit `null`. name: null, email: "jane.doe@example.com" }) ``` ``` { id: "111", coll: Customer, ts: Time("2099-02-19T14:53:53.940Z"), cart: Order("413002506150347264"), orders: "hdW...", // `name` is not stored or returned. email: "jane.doe@example.com" } ``` # `document.replaceData()` | Learn: Documents | | --- | --- | --- | Replace a [collection document](../../../../learn/data-model/documents/) using an object that may contain [metadata fields](../../../../learn/data-model/documents/). ## [](#signature)Signature ```fql-sig replaceData(data: { *: Any }) => ``` ## [](#description)Description Replaces all fields in an collection document with fields from a provided data object. Fields not present in the data object, excluding the `id`, `coll`, `ts`, and `data` metadata fields, are removed. This method differs from [`document.replace()`](../replace/) in how it handles reserved fields. See [Reserved fields](#reserved). ### [](#reserved)Reserved fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` * `data` If the provided data object contains field names that conflict with these [metadata fields](../../../../learn/data-model/documents/#meta), the method safely nests values for fields with reserved names in the `data` field. See [Examples](#examples). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Fields for the collection document. Fields not present in the object, excluding the id, coll, ts, and data metadata fields, are removed.If the object contains field names that conflict with these metadata fields, the method safely nests values for fields with reserved names in the data field. See Examples. 
| ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | Collection document with replaced fields.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. | ## [](#examples)Examples Given the following document: ``` { id: "777", coll: Product, ts: Time("2099-04-10T16:50:12.850Z"), name: "limes", description: "Conventional, 16 oz bag", price: 2_99, stock: 30, category: Category("789") } ``` Call `replaceData()` with `id` and `coll` fields in the replacement document object. These fields have the same name as reserved [metadata fields](../../../../learn/data-model/documents/). ```fql Product.byId("777")?.replaceData({ id: "12345", coll: "Product", name: "limes", description: "2 ct", price: 99, stock: 50, category: Category.byId("789") }) ``` ``` { id: "777", coll: Product, ts: Time("2099-04-10T17:54:37.670Z"), name: "limes", description: "2 ct", price: 99, stock: 50, category: Category("789"), data: { coll: "Product", id: "12345" } } ``` Rather than return an error, `replaceData()` treats any field with a reserved name as a document field and nests it in the document’s `data` property. # `document.update()` | Learn: Documents | | --- | --- | --- | Update a [collection document](../../../../learn/data-model/documents/)'s fields. ## [](#signature)Signature ```fql-sig update(data: { ttl: Time | Null, data: Null, *: Any }) => ``` ## [](#description)Description Updates a collection document with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` * `data` ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#default-values)Default values A [field definition](../../../../learn/schema/#field-definitions) can set a [default field value](../../../fsl/field-definitions/#default) for documents in a collection. Default values are not inserted for missing or `null` fields during a document update. ### [](#reserved-fields)Reserved fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` * `data` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Object with the updated document fields.The object can’t include the following metadata fields:idcolltsdata | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | The updated collection document.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. 
| ## [](#examples)Examples Given the following document: ``` { id: "777", coll: Product, ts: Time("2099-04-10T16:50:12.850Z"), name: "limes", description: "Conventional, 16 oz bag", price: 2_99, stock: 30, category: Category("789") } ``` Call `update()` with an object containing updated document fields: ```fql Product.byId("777")?.update({ name: "key limes", stock: 100 }) ``` ``` { id: "777", coll: Product, ts: Time("2099-04-10T16:50:12.850Z"), name: "key limes", description: "Conventional, 16 oz bag", price: 299, stock: 100, category: Category("789") } ``` ### [](#remove-a-field-2)Remove a field To remove a document field, set its value to `null`. Fields with `null` values aren’t stored or returned. ```fql Product.byId("777")?.update({ stock: null }) ``` ``` { id: "777", coll: Product, ts: Time("2099-04-10T17:59:50.970Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, category: Category("789") } ``` # `document.updateData()` | Learn: Documents | | --- | --- | --- | Update a [collection document](../../../../learn/data-model/documents/) using an object that may contain [metadata fields](../../../../learn/data-model/documents/). ## [](#signature)Signature ```fql-sig updateData(data: { *: Any }) => ``` ## [](#description)Description Updates a collection document with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. This method differs from [`document.update()`](../update/) in how it handles reserved fields. See [Reserved fields](#reserved). ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#default-values)Default values A [field definition](../../../../learn/schema/#field-definitions) can set a [default field value](../../../fsl/field-definitions/#default) for documents in a collection. Default values are not inserted for missing or `null` fields during a document update. ### [](#reserved)Reserved fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` * `data` If the provided data object contains field names that conflict with these [metadata fields](../../../../learn/data-model/documents/#meta), the method safely nests values for fields with reserved names in the `data` field. See [Examples](#examples). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Object with the updated document fields.If the object contains field names that conflict with these metadata fields, the method safely nests values for fields with reserved names in the data field. See Examples. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | | The updated collection document.A document’s data type is taken from its collection’s name. For example, Product for a document in the Product collection. See Document type. 
| ## [](#examples)Examples Given the following document: ``` { id: "392886847463751746", coll: Product, ts: Time("2099-04-10T16:50:12.850Z"), name: "limes", description: "Conventional, 16 oz bag", price: 299, stock: 30, category: Category("401610017107607625") } ``` Call `updateData()` with the `id` and `coll` fields in the object. These fields have the same name as reserved [metadata fields](../../../../learn/data-model/documents/). ```fql Product.byId("777")?.updateData({ id: "12345", coll: "Products", name: "key limes", stock: 100 }) ``` ``` { id: "777", coll: Product, ts: Time("2099-04-10T17:02:43.846Z"), cart: null, orders: "hdW...", name: "key limes", description: "Conventional, 16 oz bag", price: 299, stock: 100, category: Category("789"), data: { coll: "Products", id: "12345" } } ``` Rather than return an error, `updateData()` treats any field with a reserved name as a document field and nests it in the document’s `data` property. # EventSource | Learn: Event feeds and event streams | | --- | --- | --- | An [event source](../../../learn/cdc/) emits an event when tracked changes occur in a database. ## [](#create-an-event-source)Create an event source To create an event source, append [`set.eventSource()`](../set/eventsource/) or [`set.eventsOn()`](../set/eventson/) to a [supported Set](../../../learn/cdc/#sets): ```fql // Tracks all changes to the `Product` collection. Product.all().eventSource() // Tracks all changes to the `name`, `price`, // and `stock` fields in `Product` documents. Product.all().eventsOn(.name, .price, .stock) ``` The query returns a string-encoded token that represents the event source. The token has the [EventSource](../../fql/types/#event-source) type: ```json "g9WD1YPG..." ``` When consumed as an [event feed or event stream](../../../learn/cdc/), the event source emits JSON events when a tracked change occurs: ```json { "type": "add", "data": { "@doc": { "id": "392914348360597540", "coll": { "@mod": "Product" }, "ts": { "@time": "2099-03-20T21:46:12.580Z" }, "name": "cups", ... } }, "txn_ts": 1710968002310000, "cursor": "gsGabc123", "stats": { "read_ops": 8, "storage_bytes_read": 208, "compute_ops": 1, "processing_time_ms": 0, "rate_limits_hit": [] } } ``` For event properties, see [event schema](../../../learn/cdc/#event-schema). ## [](#consume-an-event-source)Consume an event source Applications can consume an event source in two ways: * **[Event feeds](../../../learn/cdc/#event-feeds)** : Asynchronous requests that poll the event source for paginated events. * **[Event streams](../../../learn/cdc/#event-streaming)**: Real-time subscriptions that push events from the event source to your application using an open connection to Fauna. ## [](#instance-methods)Instance methods You can call the following instance methods on [event sources](../../../learn/cdc/) in FQL. | Method | Description | | --- | --- | --- | --- | | eventSource.map() | Apply an anonymous function to each element of an event source's tracked Set. | | eventSource.toString() | Get "[event source]" as a string. | | eventSource.where() | Create an event source that emits events for a subset of another event source’s tracked Set. | # `eventSource.map()` | Learn: Event feeds and event streams | | --- | --- | --- | Apply an [anonymous function](../../../fql/functions/) to each element of an [event source](../../../../learn/cdc/)'s [tracked Set](../../../../learn/cdc/#sets). 
## [](#signature)Signature ```fql-sig map(mapper: (A => B)) => EventSource ``` ## [](#description)Description `map()` applies an anonymous, read-only [function](../../../fql/functions/) to each element of an existing [event source](../../../../learn/cdc/)'s [tracked Set](../../../../learn/cdc/#sets). `map()` returns a new event source. The new event source emits events that contain transformed elements in the event’s [`data` property](../../../../learn/cdc/#event-schema). `map()` does not change the calling event source. ### [](#use-cases)Use cases Common uses for `map()` include: * Transforming document structures for specific client needs * Combining multiple fields into a single value * Formatting data for external systems ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | mapper | Function | true | Anonymous, read-only function that operates on an existing event source's tracked Set elements.Writes are not permitted in the function. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | EventSource | String-encoded token for the new event source. The event source emits events for the original tracked Set. The emitted events contain transformations from the provided function. | ## [](#examples)Examples ### [](#basic-example)Basic example `Customer` collection documents have the following structure: ``` { id: "111", coll: Customer, ts: Time("2099-06-25T12:14:29.440Z"), cart: Order("412653216549831168"), orders: "hdW...", name: 'Alice Appleseed', email: 'alice.appleseed@example.com', address: { street: '87856 Mendota Court', city: 'Washington', state: 'DC', postalCode: '20220', country: 'US' } } ``` The following query uses `map()` to transform the document structure in events: ```fql Customer.all() .eventSource() .map( customer => { name: customer.name, // Transformation. Combines `address.city` and `address.state` // into a single `city` string. city: "#{customer.address.city}, #{customer.address.state}" } ) ``` ``` // String-encoded token for the new event source "g9WD1YPG..." ``` When consumed as an [event feed or event stream](../../../../learn/cdc/), the event source emits events with the transformed value in the `data` property: ``` { "type": "update", // The `data` prop contains transformed // `city` values. "data": { "name": "Alice Appleseed", "city": "Washington, DC" }, "txn_ts": 1730318669480000, "cursor": "gsG...", "stats": { "read_ops": 2, "storage_bytes_read": 738, "compute_ops": 1, "processing_time_ms": 9, "rate_limits_hit": [] } } ``` # `eventSource.toString()` | Learn: Event feeds and event streams | | --- | --- | --- | Get `"[event source]"` as a string. ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description `toString()` returns `"[event source]"` as a string. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "[event source]" | ## [](#examples)Examples ```fql Customer.all().eventSource().toString() ``` ``` "[event source]" ``` # `eventSource.where()` | Learn: Event feeds and event streams | | --- | --- | --- | Create an [event source](../../../../learn/cdc/) that emits events for a subset of another event source’s [tracked Set](../../../../learn/cdc/#sets). ## [](#signature)Signature ```fql-sig where(predicate: (A => Boolean | Null)) => EventSource ``` ## [](#description)Description The `where()` method returns an [event source](../../../../learn/cdc/). 
The event source emits [events](../../../../learn/cdc/#events) for a subset of another event source’s [tracked Set](../../../../learn/cdc/#sets). The subset’s elements must match a provided [predicate function](../../../fql/functions/#predicates). The predicate must return a [Boolean](../../../fql/types/#boolean) or [Null](../../../fql/types/#null). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Function | true | Anonymous predicate function that’s compared to an existing event source's Set elements. The function returns true for matches, false for mismatches, or Null. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | EventSource | String-encoded token for an event source. The event source emits events for Set elements that match the provided predicate function. | ## [](#examples)Examples ```fql // Only emit events for `Customer` documents // with an `.address.state` of `DC`. Customer.all() .eventSource() .where(.address.state == "DC") ``` ``` // String-encoded token for the new event source "g9WD1YPG..." ``` When consumed as an [event feed or event stream](../../../../learn/cdc/), the event source emits events for the subset matching the provided [predicate function](../../../fql/functions/#predicates): ``` // The new event source only emits events // for `Customer` documents with an // an `.address.state` of `DC`. { "type": "update", "data": { "@doc": { "id": "111", "coll": { "@mod": "Customer" }, "ts": { "@time": "2099-10-30T20:18:18.390Z" }, "cart": { "@ref": { "id": "413111684333305922", "coll": { "@mod": "Order" } } }, "orders": { "@set": "hdW..." }, "name": "Alice Appleseed", "email": "alice.appleseed@example.com", "address": { "street": "87857 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } }, "txn_ts": 1730319498390000, "cursor": "gsG...", "stats": { "read_ops": 3, "storage_bytes_read": 1049, "compute_ops": 1, "processing_time_ms": 50, "rate_limits_hit": [] } } ``` # FQL The `FQL` module provides access to all top-level FQL modules and functions. For example, you can use `FQL.Math` to access the [`Math`](../math/) module and its methods. `FQL.Math.abs(-3)` is equivalent to `Math.abs(-3)`. The `FQL` module also contains modules and functions that aren’t at the top level, such as [`FQL.Schema.defForIdentifier()`](schema-defforidentifier/). ## [](#static-methods)Static methods The following section covers `FQL` methods that aren’t available at the top level. ### [](#schema-methods)Schema methods The `Schema` module exists under the top-level `FQL` module. `Schema` has a single member: `defForIdentifier()`. | Method | Description | | --- | --- | --- | --- | | FQL.Schema.defForIdentifier() | Returns the definition for a user-defined collection or user-defined function (UDF) using the same rules as top-level identifier lookups. | # `FQL.Schema.defForIdentifier()` Returns the definition for a user-defined [collection](../../../../learn/data-model/collections/) or [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) using the same rules as top-level identifier lookups. 
## [](#signature)Signature ```fql-sig FQL.Schema.defForIdentifier(ident: String) => Any ``` ## [](#description)Description `FQL.Schema.defForIdentifier()` returns the definition for a user-defined [collection](../../../../learn/data-model/collections/) or [UDF](../../../../learn/schema/user-defined-functions/) using the same rules as top-level identifier lookups. The lookup returns the first matching resource using the following precedence: 1. A user-defined collection where `.name == ` 2. A user-defined collection where `.alias == ` 3. A UDF where `.name == ` 4. A UDF where `.alias == ` The document is an FQL definition for the resource’s FSL schema. See: * [`Collection` documents](../../collection/#collection) * [`Function` documents](../../function/#collection) The method does not retrieve definitions for system collections or other resources. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | ident | String | true | Identifier for a user-defined collection or UDF. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Any | Definition for the resource. | ## [](#examples)Examples ### [](#get-a-user-defined-collections-definition)Get a user-defined collection’s definition ```fql // Gets the FQL definition for the `Product` collection. FQL.Schema.defForIdentifier('Product') ``` ``` { name: "Product", coll: Collection, ts: Time("2099-10-22T21:56:30.975Z"), history_days: 0, indexes: { byCategory: { terms: [ { field: ".category", mva: false } ], queryable: true, status: "complete" }, sortedByCategory: { values: [ { field: ".category", order: "asc", mva: false } ], queryable: true, status: "complete" }, byName: { terms: [ { field: ".name", mva: false } ], queryable: true, status: "complete" }, sortedByPriceLowToHigh: { values: [ { field: ".price", order: "asc", mva: false }, { field: ".name", order: "asc", mva: false }, { field: ".description", order: "asc", mva: false }, { field: ".stock", order: "asc", mva: false } ], queryable: true, status: "complete" } }, constraints: [ { unique: [ { field: ".name", mva: false } ], status: "active" }, { check: { name: "stockIsValid", body: "(product) => product.stock >= 0" } }, { check: { name: "priceIsValid", body: "(product) => product.price > 0" } } ], fields: { name: { signature: "String" }, description: { signature: "String" }, price: { signature: "Int" }, category: { signature: "Ref" }, stock: { signature: "Int" } } } ``` ### [](#get-a-udfs-definition)Get a UDF’s definition ```fql // Gets the FQL definition for the `validateOrderStatusTransition()` UDF. FQL.Schema.defForIdentifier('validateOrderStatusTransition') ``` ``` { name: "validateOrderStatusTransition", coll: Function, ts: Time("2024-10-28T15:11:25.460Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END } ``` # Function | Learn: User-defined functions (UDFs) | | --- | --- | --- | A [user-defined function (UDF)](../../../learn/schema/user-defined-functions/) is a set of one or more FQL statements stored as a reusable resource in a Fauna database. Like a stored procedure in SQL, a UDF can accept parameters, perform operations, and return results. 
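In FSL schema, a UDF is declared with a `function` block. The following is a minimal, hypothetical sketch rather than an example from this reference; it assumes a `Customer` collection with a `name` field exists:

```fsl
// A minimal FSL sketch of a UDF. The collection and field
// names are assumptions, not part of this reference.
function getCustomerName(id) {
  Customer.byId(id)!.name
}
```

The next section shows how a UDF like this is represented as a `Function` document in FQL.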
## [](#collection)`Function` collection Fauna stores UDFs as documents in the `Function` system collection. These documents have the [FunctionDef](../../fql/types/#functiondef) type and are an FQL version of the FSL [function schema](../../fsl/function/). `Function` documents have the following FQL structure: ``` { name: "getOrCreateCart", coll: Function, ts: Time("2099-09-25T21:53:08.780Z"), role: "server", body: <<-END (id) => { let customer = Customer.byId(id)! if (customer!.cart == null) { Order.create({ status: "cart", customer: Customer.byId(id), createdAt: Time.now(), payment: { } }) } else { customer!.cart } } END } ``` | Field | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | name | String | | true | Name of the function.Can’t be documents, events, self, sets, or a single underscore _, and can’t include a space character. | | coll | String | true | | Collection name: Function. | | ts | Time | true | | Last time the document was created or updated. | | role | String | | | Associates a runtime role with the UDF. An FQL version of the @role annotation.By default, UDFs run with the privileges of the calling query’s authentication secret. If a role is provided, the UDF runs using the annotated role’s privileges, regardless of the secret used to call it.The Role can be a user-defined role or one of the following built-in roles:adminserverserver-readonlyThe role field is typically used to give a role controlled access to sensitive data without granting broader privileges. See Runtime privileges.Use role carefully. Use the role with the fewest privileges needed perform the UDF’s operations. | | body | String | | true | FQL function body. | | data | Object | | | Arbitrary user-defined metadata for the document. | ## [](#static-methods)Static methods You can use the following static methods to manage the `Function` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Function() | Call a user-defined function (UDF) by its name. | | Function.all() | Get a Set of all user-defined functions (UDFs). | | Function.byName() | Get a user-defined function (UDF) by its name. | | Function.create() | Create a user-defined function (UDF). | | Function.firstWhere() | Get the first user-defined function (UDF) that matches a provided predicate. | | Function.toString() | Get "Function" as a String. | | Function.where() | Get a Set of user-defined functions (UDFs) that match a provided predicate. | ## [](#instance-properties)Instance properties `Function` documents have the following properties. You access the property using an existing UDF’s name. | Property | Description | | --- | --- | --- | --- | | function.definition | Get or update a user-defined function (UDF)'s definition, represented as a Function document. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage function definitions, represented as `Function` documents, in FQL. You call the methods on a [FunctionDef](../../fql/types/#functiondef). | Method | Description | | --- | --- | --- | --- | | functionDef.delete() | Delete a user-defined function (UDF). | | functionDef.exists() | Test if a user-defined function (UDF) exists. | | functionDef.replace() | Replace a user-defined function (UDF). | | functionDef.update() | Update a user-defined function (UDF). 
| # `function.definition` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Get or update a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/)'s definition, represented as a [`Function` document](../#collection). ## [](#signature)Signature ```fql-sig .definition: FunctionDef ``` ## [](#description)Description The `definition` property is a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/)'s schema, represented as a [`Function` document](../#collection) with the [FunctionDef](../../../fql/types/#functiondef) type. The document is an FQL version of the FSL [function schema](../../../fsl/function/). You access the property using an existing UDF’s name. ### [](#definition-properties)Definition properties You can use [dot or bracket notation](../../../fql/dot-notation/#dot-notation-field-accessor) to access specific fields in the definition. See [Access definition properties](#access). ### [](#definition-methods)Definition methods The `definition` property supports [function instance methods](../#instance-methods). ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | FunctionDef | Definition for the UDF, represented as a Function document. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Get the definition for // the `getOrCreateCart()` UDF. getOrCreateCart.definition ``` ``` { name: "getOrCreateCart", coll: Function, ts: Time("2099-10-03T20:30:59.360Z"), body: <<-END (id) => { let customer = Customer.byId(id)! if (customer!.cart == null) { Order.create({ status: "cart", customer: Customer.byId(id), createdAt: Time.now(), payment: { } }) } else { customer!.cart } } END } ``` ### [](#access)Access definition properties Use [dot or bracket notation](../../../fql/dot-notation/#dot-notation-field-accessor) to access specific fields in the definition: ```fql // Access the `body` field for // the `getOrCreateCart()` UDF. getOrCreateCart.definition.body ``` ``` // Only returns the `body` field. <<-END (id) => { let customer = Customer.byId(id)! if (customer!.cart == null) { Order.create({ status: "cart", customer: Customer.byId(id), createdAt: Time.now(), payment: { } }) } else { customer!.cart } } END ``` # `Function()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Call a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) by its name. ## [](#signature)Signature ```fql-sig Function(function: String) => Any ``` ## [](#description)Description Calls a [UDF](../../../../learn/schema/user-defined-functions/) by its name. You can pass arguments to the UDF. See [Examples](#examples). ### [](#errors)Errors If you attempt to call a UDF that doesn’t exist, Fauna returns a query runtime error with an `invalid_argument` [error code](../../../http/reference/errors/) and a 400 HTTP status code: ``` invalid_argument error: invalid argument `function`: No such user function `Foo`. at *query*:1:9 | 1 | Function("Foo") | ^^^^^^^ | ``` ### [](#comparison-to-functionname)Comparison to calling a UDF directly Calling `Function()` is similar to calling a UDF directly by its name, except `Function()` returns an [Any](../../../fql/types/#any) value. This difference only affects static typing, not runtime behavior. In most cases, you should call the UDF directly by its name. However, `Function()` is useful if you need to iterate through a list of UDF calls.
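For example, here’s a minimal sketch of that pattern. It assumes the `hello` and `hello2` UDFs created in the `Function.create()` examples exist:

```fql
// Calls each UDF in the Array by name and collects the results.
// Assumes the `hello` and `hello2` UDFs exist.
["hello", "hello2"].map(name => Function(name)("World"))
```

With the UDF bodies from those examples, this returns `["Hello World!", "Hello to you, World!"]`.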
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | function | String | true | Function to call | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Any | Results of the function call. | ## [](#examples)Examples Get the function by passing the function name to `Function()`: ```fql // Call function named `getOrCreateCart` Function("getOrCreateCart") ``` ``` "[function getOrCreateCart]" ``` To call the function, pass arguments in parentheses: ```fql // Calls the `getOrCreateCart()` function on // a `Customer` document with an `id` of `111`. Function("getOrCreateCart")('111') ``` ``` { id: "412998994633949261", coll: Order, ts: Time("2024-10-28T14:23:03.966Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2024-10-28T14:23:03.795981Z"), payment: {} } ``` # `Function.all()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Get a Set of all [user-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ## [](#signature)Signature ```fql-sig Function.all() => Set Function.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [UDFs](../../../../learn/schema/user-defined-functions/), represented as [`Function` documents](../), for the database. To limit the returned Set, you can provide an optional range. `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). If `Function.all()` is the last value in a query, the first page of the [Set](../../../fql/types/#set) is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Function documents in the form { from: start, to: end }.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all UDFs are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be an Function document. | | to | Any | | End of the range (inclusive). Must be an Function document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Function documents in the provided range. If a range is omitted, all UDFs are returned.The Set is empty if:The database has no UDFs.There are no UDFs in the provided range.The provided range’s from value is greater than to. 
| ## [](#examples)Examples ```fql Function.all() ``` ``` { data: [ { name: "validateOrderStatusTransition", coll: Function, ts: Time("2024-10-25T17:49:28.145Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END }, ... ] } ``` # `Function.byName()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Get a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) by its name. ## [](#signature)Signature ```fql-sig Function.byName(name: String) => NamedRef ``` ## [](#description)Description Gets a [UDF](../../../../learn/schema/user-defined-functions/), represented as an [`Function` document](../), by its name. `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | name | String | true | name of the Function document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NamedRef | Resolved reference to the Function document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Function.byName("validateOrderStatusTransition") ``` ``` { name: "validateOrderStatusTransition", coll: Function, ts: Time("2024-10-25T17:49:28.145Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END } ``` # `Function.create()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Create a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/). ## [](#signature)Signature ```fql-sig Function.create(data: { name: String, alias: String | Null, signature: String | Null, role: String | Null, body: String, data: { *: Any } | Null }) => FunctionDef ``` ## [](#description)Description Creates an [UDF](../../../../learn/schema/user-defined-functions/) with the provided document fields. Fauna stores UDFs as documents in the [`Function` system collection](../). `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method adds a function to the staged schema, not the active schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). 
Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the new Function document.For supported document fields, see Function collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | FunctionDef | The new Function document. | ## [](#examples)Examples 1. Create a UDF: ```fql Function.create({ name: 'hello', body: '(x) => "Hello #{x}!"' }) ``` ``` { name: "hello", coll: Function, ts: Time("2099-10-25T17:56:30.150Z"), body: "(x) => \"Hello \#{x}!\"" } ``` 2. Call the UDF: ```fql hello("World") ``` ``` "Hello World!" ``` 1. Use heredoc syntax to avoid interpolation until the function executes: ```fql Function.create({ name: "hello2", body: <<-EOB name => "Hello to you, #{name}!" EOB }) ``` ``` { name: "hello2", coll: Function, ts: Time("2099-11-03T17:06:02.790Z"), body: <<-END name => "Hello to you, #{name}!" END } ``` 2. Call the UDF: ```fql hello2("World") ``` ``` "Hello to you, World!" ``` Create a UDF with the `admin` role: ```fql Function.create({ name: 'Doublex', body: '(x) => x + x', role: 'admin', }) ``` ``` { name: "Doublex", coll: Function, ts: Time("2099-06-25T15:03:14.060Z"), body: "(x) => x + x", role: "admin" } ``` Create a UDF with a `data` metadata field: ```fql Function.create({ name: "square", body: "x => x * x", data: { version: "1.0" } }) ``` ``` { name: "square", coll: Function, ts: Time("2099-01-08T20:30:50.090Z"), body: "x => x * x", data: { version: "1.0" } } ``` # `Function.firstWhere()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Get the first [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Function.firstWhere(pred: (FunctionDef => Boolean)) => FunctionDef | Null ``` ## [](#description)Description Gets the first [UDF](../../../../learn/schema/user-defined-functions/), represented as an [`Function` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). 
### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Function document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first Function document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | FunctionDef | First Function document that matches the predicate. | | Null | No Function document matches the predicate. | ## [](#examples)Examples Get the first UDF when at least one matching UDF document exists and is accessible: ```fql Function.firstWhere(.name.includes('validate')) ``` ``` { name: "validateOrderStatusTransition", coll: Function, ts: Time("2099-10-25T17:49:28.145Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END } ``` # `Function.toString()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Get `"Function"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Function.toString() => String ``` ## [](#description)Description Returns the name of the [`Function` collection](../) as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Function" | ## [](#examples)Examples ```fql Function.toString() ``` ``` "Function" ``` # `Function.where()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Get a Set of [user-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Function.where(pred: (FunctionDef => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [UDFs](../../../../learn/schema/user-defined-functions/), represented as [`Function` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). If `Function.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts an Function document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of Function documents for which the predicate returns true. 
| ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Function documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql Function.where(.name.includes('validate')) ``` ``` { data: [ { name: "validateOrderStatusTransition", coll: Function, ts: Time("2099-10-25T17:49:28.145Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END } ] } ``` # `functionDef.delete()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Delete a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/). ## [](#signature)Signature ```fql-sig delete() => NullFunctionDef ``` ## [](#description)Description Deletes a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/), represented as a [`Function` document](../). `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ### [](#staged-schema)Staged schema You can’t delete a function while a database has [staged schema](../../../../learn/schema/manage-schema/#staged). If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullFunctionDef | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ```fql Function.byName("checkout")?.delete() ``` ``` Function.byName("checkout") /* deleted */ ``` # `functionDef.exists()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Test if a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if an [UDF](../../../../learn/schema/user-defined-functions/), represented as an [`Function` document](../), exists. Fauna stores UDFs as documents in the [`Function` system collection](../). 
`Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Function.byName("createOrUpdateCartItem").exists() // true Function.byName("createOrUpdateCartItem") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Function document exists. If false, the Function document doesn’t exist. | ## [](#examples)Examples ```fql Function.byName("checkout").exists() ``` ``` true ``` # `functionDef.replace()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Replace a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/). ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => FunctionDef ``` ## [](#description)Description Replaces all fields in a [UDF](../../../../learn/schema/user-defined-functions/), represented as an [`Function` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `coll` and `ts` metadata fields, are removed. `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename a function while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. 
This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Fields for the Function document. Fields not present, excluding the coll and ts metadata fields, in the object are removed.For supported document fields, see Function collection.The object can’t include the following metadata fields:* coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | FunctionDef | Function document with replaced fields. | ## [](#examples)Examples ```fql Function.byName("validateOrderStatusTransition")?.replace({ name: "validateOrderStatusTransition", role: "server", body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END }) ``` ``` { name: "validateOrderStatusTransition", coll: Function, ts: Time("2099-10-28T15:11:25.460Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END, role: "server" } ``` # `functionDef.update()` | Learn: User-defined functions (UDFs) | | --- | --- | --- | Update a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/). ## [](#signature)Signature ```fql-sig update(data: { *: Any }) => FunctionDef ``` ## [](#description)Description Updates a [UDF](../../../../learn/schema/user-defined-functions/), represented as an [`Function` document](../), with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. `Function` documents are FQL versions of a database’s FSL [function schema](../../../fsl/function/). `Function` documents have the [FunctionDef](../../../fql/types/#functiondef) type. See [User-defined functions (UDFs)](../../../../learn/schema/user-defined-functions/). ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. 
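For example, a minimal sketch that removes the optional `role` field from a UDF, assuming a `getOrCreateCart()` UDF with a `role` set exists:

```fql
// Setting `role` to `null` removes the field, so the UDF
// reverts to running with the calling query's privileges.
Function.byName("getOrCreateCart")?.update({ role: null })
```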
### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename a function while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the Function document.For supported document fields, see Function collection.The object can’t include the following metadata fields:* coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | FunctionDef | The updated Function document. | ## [](#examples)Examples ```fql Function.byName("validateOrderStatusTransition")?.update({ name: "validateOrderStatusTransition", body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END, role: "server" }) ``` ``` { name: "validateOrderStatusTransition", coll: Function, ts: Time("2099-10-25T18:16:27.725Z"), body: <<-END (oldStatus, newStatus) => { if (oldStatus == "cart" && newStatus != "processing") { abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { abort("Invalid status transition.") } } END, role: "server" } ``` # Key | Learn: Keys | | --- | --- | --- | A [key](../../../learn/security/keys/) is a type of [authentication secret](../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../learn/security/tokens/), keys are not associated with an identity. ## [](#collection)`Key` collection Fauna stores keys scoped to a database as documents in the database’s `Key` system collection. 
`Key` documents have the following FQL structure: ``` { id: "371460335192768546", coll: Key, ts: Time("2099-07-28T02:23:51.300Z"), ttl: Time("2099-07-29T02:23:51.189192Z"), role: "admin", database: "child_db", data: { name: "System-generated dashboard key" }, secret: "fn..." } ``` | Field name | Type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | id | ID | | | ID for the Key document. The ID is a string-encoded, 64-bit unsigned integer in the 2^53 - 1 range. The ID is unique within the collection. IDs are assigned at document creation. To create a key with a user-provided id using Key.create(), you must use a secret with the create_with_id privilege for the Key collection. If not provided, Fauna generates the id. | | coll | Collection | true | | Collection name: Key. | | ts | Time | true | | Last time the document was created or updated. | | role | String | | true | Role assigned to the key. Can be a user-defined role or one of the following built-in roles: admin, server, or server-readonly. If you specify a user-defined role and a child database, the role must be defined in the specified child database. | | database | String \| Null | | | Child database to which the key is scoped. The child database must be directly nested under the database scoped to the query’s authentication secret. If not present, the key is scoped to the same database as the authentication secret. | | ttl | Time | | | Time-to-live (TTL) for the document. Only present if set. If not present or set to null, the document persists indefinitely. | | data | { *: Any } \| Null | | | Arbitrary user-defined metadata for the document. | | secret | String | | | The secret is a randomly generated cryptographic hash. This field isn’t stored in the document. The secret is only accessible in the Key.create() return. A caller obtains the secret from this return and stores it for subsequent queries. Fauna can’t recover a discarded or lost secret. | ## [](#static-methods)Static methods You can use the following static methods to manage the `Key` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Key.all() | Get a Set of all keys. | | Key.byId() | Get a key by its document id. | | Key.create() | Create a key. | | Key.firstWhere() | Get the first key that matches a provided predicate. | | Key.toString() | Get "Key" as a String. | | Key.where() | Get a Set of keys that match a provided predicate. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage specific `Key` documents in FQL. | Method | Description | | --- | --- | --- | --- | | key.delete() | Delete a key. | | key.exists() | Test if a key exists. | | key.replace() | Replace a key. | | key.update() | Update a key. | ## [](#dashboard)Dashboard-created keys The [Fauna Dashboard](https://dashboard.fauna.com/) automatically creates a temporary key when you: * Log in to the Dashboard. This key has the built-in `admin` role. * Use the Dashboard Shell’s authentication drop-down to run a query using a role other than **Admin**. ![Run a query as a role](../../../learn/_images/run-as-role.png) Dashboard-created keys have a 15-minute [`ttl` (time-to-live)](../../../learn/security/keys/#ttl) and are scoped to their specific database.
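Because Dashboard-created keys are regular `Key` documents, you can query them like any other key. A minimal sketch that lists them by the system-set `data.name` metadata shown below:

```fql
// Gets Dashboard-created keys by filtering on the `data.name`
// metadata the Dashboard sets when it creates the key.
Key.where(.data?.name == "System-generated dashboard key")
```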
Related `Key` documents include a `data` field with related metadata: ``` { id: "414467050449141793", coll: Key, ts: Time("2099-11-13T19:17:11.020Z"), ttl: Time("2099-11-13T19:32:09.915Z"), data: { name: "System-generated dashboard key" }, role: "admin" } ``` The Dashboard surfaces this metadata in the database’s **Keys** tab on the **Explorer** page. ![Key’s tab in the Fauna Dashboard](../../../learn/_images/keys-tab.png) # `Key.all()` | Learn: Keys | | --- | --- | --- | Get a Set of all [keys](../../../../learn/security/keys/). ## [](#signature)Signature ```fql-sig Key.all() => Set Key.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [keys](../../../../learn/security/keys/), represented as [`Key` documents](../), for the database. To limit the returned Set, you can provide an optional range. If `Key.all()` is the last value in a query, the first page of the [Set](../../../fql/types/#set) is returned. See [Pagination](../../../../learn/query/pagination/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Key documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded Key.all() call. See Range examples.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all Key documents are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be an Key document. | | to | Any | | End of the range (inclusive). Must be an Key document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Key documents in the provided range. If a range is omitted, all Key documents are returned.The Set is empty if:The database has no keys.There are no keys in the provided range.The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples 1. Get all keys for the database: ```fql Key.all() ``` ``` { data: [ { id: "111", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "admin", data: { desc: "Admin key for prod app database" } }, { id: "222", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "server", data: { desc: "Server key for prod app database" } }, { id: "333", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "server-readonly", data: { desc: "Server-readonly key for prod app database" } } ] } ``` 2. Get a Set of `Key` documents from ID `111` (inclusive) to ID `222` (inclusive): ```fql Key.all({ from: Key.byId("111"), to: Key.byId("222") }) ``` ``` { data: [ { id: "111", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "admin", data: { desc: "Admin key for prod app database" } }, { id: "222", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "server", data: { desc: "Server key for prod app database" } } ] } ``` 3. 
Get a Set of keys up to ID `222` (inclusive): ```fql Key.all({ to: Key.byId("222") }) ``` ``` { data: [ { id: "111", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "admin", data: { desc: "Admin key for prod app database" } }, { id: "222", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "server", data: { desc: "Server key for prod app database" } } ] } ``` # `Key.byId()` | Learn: Keys | | --- | --- | --- | Get a [key](../../../../learn/security/keys/) by its [document `id`](../../../../learn/data-model/documents/#meta). ## [](#signature)Signature ```fql-sig Key.byId(id: ID) => Ref ``` ## [](#description)Description Gets a [key](../../../../learn/security/keys/), represented as a [`Key` document](../), by its [document `id`](../../../../learn/data-model/documents/#meta). A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | id | String | true | ID of the Key document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Ref | Resolved reference to the Key document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Key.byId("412655134325080576") ``` ``` { id: "412655134325080576", coll: Key, ts: Time("2099-08-01T02:45:50.490Z"), role: "admin", data: { desc: "Admin key for prod app database" } } ``` # `Key.create()` | Learn: Keys | | --- | --- | --- | Create a [key](../../../../learn/security/keys/). ## [](#signature)Signature ```fql-sig Key.create(data: { role: String, database: String | Null, ttl: Time | Null, data: { *: Any } | Null }) => Key ``` ## [](#description)Description Creates a [key](../../../../learn/security/keys/) with the provided document fields. Fauna stores keys as documents in the [`Key` system collection](../). A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the new Key document. For supported document fields, see Key collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Key | The new Key document. Includes the key’s secret, which you can use to authenticate with Fauna. A key’s secret is shown only once, when you create the key. You can’t recover or regenerate a lost key secret. Instead, delete the key and create a new one. | ## [](#examples)Examples Create a key with the built-in `admin` role that expires tomorrow: ```fql Key.create({role: "admin", ttl: Time.now().add(1, "day")}) ``` ``` { id: "412655134325080576", coll: Key, ts: Time("2099-07-28T02:23:51.300Z"), ttl: Time("2099-07-29T02:23:51.189192Z"), secret: "fn...", role: "admin" } ``` # `Key.firstWhere()` | Learn: Keys | | --- | --- | --- | Get the first [key](../../../../learn/security/keys/) that matches a provided [predicate](../../../fql/functions/#predicates).
## [](#signature)Signature ```fql-sig Key.firstWhere(pred: (Key => Boolean)) => Key | Null ``` ## [](#description)Description Gets the first [key](../../../../learn/security/keys/), represented as a [`Key` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Key document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first Key document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Key | First Key document that matches the predicate. | | Null | No Key document matches the predicate. | ## [](#examples)Examples ```fql Key.firstWhere(.role == "admin") ``` ``` { id: "412655134325080576", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "admin", data: { desc: "Admin key for prod app database" } } ``` # `Key.toString()` | Learn: Keys | | --- | --- | --- | Get `"Key"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Key.toString() => String ``` ## [](#description)Description Returns the name of the [`Key` collection](../) as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Key" | ## [](#examples)Examples ```fql Key.toString() ``` ``` "Key" ``` # `Key.where()` | Learn: Keys | | --- | --- | --- | Get a Set of [keys](../../../../learn/security/keys/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Key.where(pred: (Key => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [keys](../../../../learn/security/keys/), represented as [`Key` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. If `Key.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts an Key document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of Key documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Key documents that match the predicate. If there are no matching documents, the Set is empty. 
| ## [](#examples)Examples ```fql Key.where(.data != null) ``` ``` { data: [ { id: "412655134325080576", coll: Key, ts: Time("2099-07-19T20:53:15.250Z"), role: "admin", data: { desc: "Admin key for prod app database" } }, { id: "412655134325080577", coll: Key, ts: Time("2099-07-28T02:25:35.050Z"), role: "server", data: { desc: "Server key for prod app database" } } ] } ``` # `key.delete()` | Learn: Keys | | --- | --- | --- | Delete a [key](../../../../learn/security/keys/). ## [](#signature)Signature ```fql-sig delete() => NullKey ``` ## [](#description)Description Deletes a [key](../../../../learn/security/keys/), represented as a [`Key` document](../). A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullKey | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ### [](#basic-ex)Basic example ```fql Key.byId("412655134325080576")!.delete() ``` ``` Key("412655134325080576") /* deleted */ ``` ### [](#delete-all-keys)Delete all keys To [avoid throttling](../../../../learn/data-model/databases/#delete-cons), you can incrementally delete all keys for a database before deleting the database itself. To stay within [transaction size limits](../../../requirements-limits/#glimits), use [`set.paginate()`](../../set/paginate/) to perform the deletions over several queries instead of one. ```fql // Gets all `Key` system collection documents. // Uses `pageSize()` to limit the page size. // Uses `paginate()` to project the after cursor. let page = Key.all().pageSize(200).paginate() // `paginate()` returns an object. The object's `data` property // contains an Array of `Key` documents. let data = page.data // Use `forEach()` to delete each `Key` document in the // `data` Array. data.forEach(doc => doc.delete()) // Project the `after` cursor returned by `paginate()`. // Use the cursor to iterate through the remaining pages. page { after } ``` ``` { after: "hdWDxoq..." } ``` Subsequent queries use the cursor and [`Set.paginate()`](../../set/static-paginate/) to iterate through the remaining pages: ```fql // Uses `Set.paginate()` to iterate through pages. let page = Set.paginate("hdW...") let data = page.data data.forEach(doc => doc.delete()) page { after } ``` # `key.exists()` | Learn: Keys | | --- | --- | --- | Test if a [key](../../../../learn/security/keys/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if an [key](../../../../learn/security/keys/), represented as an [`Key` document](../), exists. A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Key.byId("12345").exists() // true Key.byId("12345") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. 
Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Key document exists. If false, the Key document doesn’t exist. | ## [](#examples)Examples ```fql Key.byId("412655134325080576").exists() ``` ``` true ``` # `key.replace()` | Learn: Keys | | --- | --- | --- | Replace a [key](../../../../learn/security/keys/). ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => Key ``` ## [](#description)Description Replaces all fields in a [key](../../../../learn/security/keys/), represented as an [`Key` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `id`, `coll`, and `ts` metadata fields, are removed. A key is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used for anonymous access to a Fauna database. Unlike [tokens](../../../../learn/security/tokens/), keys are not associated with an identity. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Fields for the Key document. Fields not present, excluding the id, coll, and ts metadata fields, in the object are removed.For supported document fields, see Key collection.The object can’t include the following metadata fields:* id * coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Key | Key document with replaced fields. | ## [](#examples)Examples ```fql Key.byId("412655134325080576")!.replace({ role: "server", data: { desc: "Server key for prod app database" } }) ``` ``` { id: "412655134325080576", coll: Key, ts: Time("2099-07-28T02:25:07.910Z"), role: "server", data: { desc: "Server key for prod app database" } } ``` # `key.update()` | Learn: Keys | | --- | --- | --- | Update a [key](../../../../learn/security/keys/). ## [](#signature)Signature ```fql-sig update(data: { *: Any }) => Key ``` ## [](#description)Description Updates a [key](../../../../learn/security/keys/), represented as an [`Key` document](../), with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the Key document.For supported document fields, see Key collection.The object can’t include the following metadata fields:* id * coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Key | The updated Key document. 
| ## [](#examples)Examples ```fql Key.byId("412655134325080576")!.update({ data: { desc: "Admin key for prod app database" } }) ``` ``` { id: "412655134325080576", coll: Key, ts: Time("2099-07-11T14:17:49.890Z"), role: "admin", data: { desc: "Admin key for prod app database" } } ``` # Math `Math` methods and properties. ## [](#description)Description `Math` is a built-in object that has methods for performing mathematical operations. ## [](#properties)Properties | Property | Description | | --- | --- | --- | --- | | Math.E | Get the Euler’s number mathematical constant (℮). | | Math.Infinity | String value representing infinity. | | Math.NaN | Value representing Not-a-Number. | | Math.PI | Get the mathematical constant pi (π). | ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Math.abs() | Get the absolute value of a Number. | | Math.acos() | Get the inverse cosine in radians of a Number. | | Math.asin() | Get the inverse sine in radians of a Number. | | Math.atan() | Get the inverse tangent in radians of a Number. | | Math.ceil() | Round up a Number. | | Math.cos() | Get the cosine of a Number in radians. | | Math.cosh() | Get the hyperbolic cosine of a Number. | | Math.degrees() | Convert radians to degrees. | | Math.exp() | Get the value of ℮ raised to the power of a Number. | | Math.floor() | Round down a Number. | | Math.hypot() | Get the hypotenuse of a right triangle. | | Math.log() | Get the natural logarithm, base e, of a Number. | | Math.log10() | Get the base 10 logarithm of a Number. | | Math.max() | Get the larger of two Numbers. | | Math.mean() | Get the arithmetic mean of an Array or Set of Numbers. | | Math.min() | Get the smaller of the input parameter Numbers. | | Math.pow() | Get the value of a base raised to a power. | | Math.radians() | Convert the value of a Number in degrees to radians. | | Math.round() | Get the value of a Number rounded to the nearest integer. | | Math.sign() | Get the sign of a Number. | | Math.sin() | Get the sine of a Number in radians. | | Math.sinh() | Get the hyperbolic sine of a Number. | | Math.sqrt() | Get the square root of a Number. | | Math.sum() | Get the sum of an Array or Set of Numbers. | | Math.tan() | Get the tangent of a Number in radians. | | Math.tanh() | Get the hyperbolic tangent of a Number. | | Math.trunc() | Truncate a Number to a given precision. | # `Math.E` Get the Euler’s number mathematical constant (℮). ## [](#signature)Signature ```fql-sig Math.E: Number ``` ## [](#description)Description Returns Euler’s constant. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of Euler’s constant. | ## [](#examples)Examples ```fql Math.E ``` ``` 2.718281828459045 ``` # `Math.Infinity` String value representing infinity. ## [](#signature)Signature ```fql-sig Math.Infinity: Number ``` ## [](#description)Description `Math.Infinity` represents a double precision floating point number that is greater than any other number. `-Math.Infinity` represents a floating point number that is less than any other number. See [IEEE 754](https://en.wikipedia.org/wiki/IEEE_754). ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Math.Infinity or -Math.Infinity | ## [](#examples)Examples ```fql "1.7976931348623159e308".parseNumber() ``` ``` Math.Infinity ``` ```fql Math.Infinity * -1.0 ``` ``` -Math.Infinity ``` # `Math.NaN` Value representing Not-a-Number. 
## [](#signature)Signature ```fql-sig Math.NaN: Number ``` ## [](#description)Description NaN stands for Not-a-Number and `Math.NaN` represents a double precision floating point number that is undefined or can’t be represented. See [IEEE 754](https://en.wikipedia.org/wiki/IEEE_754) and [NaN](https://en.wikipedia.org/wiki/NaN). ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Math.NaN | ## [](#examples)Examples ```fql 0.0 / 0.0 ``` ``` Math.NaN ``` # `Math.PI` Get the mathematical constant pi (π). ## [](#signature)Signature ```fql-sig Math.PI: Number ``` ## [](#description)Description Gets the mathematical constant pi, which is the ratio of the circumference of a circle to its diameter. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of pi. | ## [](#examples)Examples ```fql let radius = 5 Math.PI * (radius + radius) ``` ``` 31.41592653589793 ``` # `Math.abs()` Get the absolute value of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.abs(x: Number) => Number ``` ## [](#description)Description The `Math.abs()` method returns the absolute value of a provided [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number for which you want the absolute value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Absolute value of the provided number. | ## [](#examples)Examples 1. Get the absolute value of an integer: ```fql Math.abs(5) ``` ``` 5 ``` 2. Get the absolute value of a negative integer: ```fql Math.abs(-5) ``` ``` 5 ``` 3. Get the absolute value of a real number: ```fql Math.abs(-5.1) ``` ``` 5.1 ``` # `Math.acos()` Get the inverse cosine in radians of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.acos(x: Number) => Number ``` ## [](#description)Description Gets the inverse cosine in radians of a [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Inverse cosine in radians of the provided number. | ## [](#examples)Examples ```fql Math.acos(0.5) ``` ``` 1.0471975511965979 ``` # `Math.asin()` Get the inverse sine in radians of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.asin(x: Number) => Number ``` ## [](#description)Description Gets the inverse sine in radians of a [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Inverse sine in radians of the provided number. | ## [](#examples)Examples ```fql Math.asin(0.5) ``` ``` 0.5235987755982989 ``` # `Math.atan()` Get the inverse tangent in radians of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.atan(x: Number) => Number ``` ## [](#description)Description Gets the inverse tangent in radians of a [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number.
| ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Inverse tangent in radians of the provided number. | ## [](#examples)Examples ```fql Math.atan(1) ``` ``` 0.7853981633974483 ``` # `Math.ceil()` Round up a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.ceil(x: Number) => Number ``` ## [](#description)Description Gets the value of a provided [Number](../../../fql/types/#number) rounded up. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number to round up. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of the provided number rounded up. | ## [](#examples)Examples ```fql Math.ceil(7.004) ``` ``` 8.0 ``` # `Math.cos()` Get the cosine of a [Number](../../../fql/types/#number) in radians. ## [](#signature)Signature ```fql-sig Math.cos(x: Number) => Number ``` ## [](#description)Description Gets the cosine of a [Number](../../../fql/types/#number) in radians. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number representing an angle in radians. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Cosine of the provided number between -1 and 1, inclusive. | ## [](#examples)Examples ```fql Math.cos(2 * Math.PI) ``` ``` 1.0 ``` # `Math.cosh()` Get the hyperbolic cosine of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.cosh(x: Number) => Number ``` ## [](#description)Description Gets the hyperbolic cosine of a [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Hyperbolic cosine of the provided number. | ## [](#examples)Examples ```fql Math.cosh(1) ``` ``` 1.543080634815244 ``` # `Math.degrees()` Convert radians to degrees. ## [](#signature)Signature ```fql-sig Math.degrees(x: Number) => Number ``` ## [](#description)Description Return the value of a [Number](../../../fql/types/#number) in radians converted to degrees. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number in radians. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of the provided number converted to degrees. | ## [](#examples)Examples ```fql Math.degrees(0.017453293) ``` ``` 1.0000000275052232 ``` # `Math.exp()` Get the value of ℮ raised to the power of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.exp(x: Number) => Number ``` ## [](#description)Description Gets the value of _℮_ raised to the power of a provided [Number](../../../fql/types/#number). See [`Math.E`](../e/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Non-negative number representing ℮ to the power of a provided number, where ℮ is the base of the natural logarithm. 
| ## [](#examples)Examples ```fql let num = Math.exp(1) Math.trunc(num, 15) ``` ``` 2.718281828201507 ``` # `Math.floor()` Round down a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.floor(x: Number) => Number ``` ## [](#description)Description Gets the value of the provided [Number](../../../fql/types/#number) rounded down. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number to round down. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of the provided number rounded down. | ## [](#examples)Examples ```fql Math.floor(-45.95) ``` ``` -46.0 ``` # `Math.hypot()` Get the hypotenuse of a right triangle. ## [](#signature)Signature ```fql-sig Math.hypot(x: Number, y: Number) => Number ``` ## [](#description)Description Return the hypotenuse of a right triangle or the square root of the sum of squares. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number to be squared and added to the square of y. | | y | Number | true | A number to be squared and added to the square of x. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Hypotenuse of a right triangle, or the square root of the sum of the squares of x and y. | ## [](#examples)Examples ```fql Math.hypot(3, 4) ``` ``` 5.0 ``` # `Math.log()` Get the natural logarithm, base e, of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.log(x: Number) => Number ``` ## [](#description)Description Gets the natural logarithm, base e, of a provided [Number](../../../fql/types/#number). The [Number](../../../fql/types/#number) must be greater than or equal to zero. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number greater than or equal to zero. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Natural logarithm, base e, of the provided number. | ## [](#examples)Examples ```fql Math.log(10) ``` ``` 2.302585092994046 ``` # `Math.log10()` Get the base 10 logarithm of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.log10(x: Number) => Number ``` ## [](#description)Description Gets the base 10 logarithm of a provided [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number greater than or equal to zero. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Base 10 logarithm of the provided number. | ## [](#examples)Examples ```fql Math.log10(2) ``` ``` 0.3010299956639812 ``` # `Math.max()` Get the larger of two [Number](../../../fql/types/#number)s. ## [](#signature)Signature ```fql-sig Math.max(x: Number, y: Number) => Number ``` ## [](#description)Description Gets the larger of two provided [Number](../../../fql/types/#number)s. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | | y | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Greater of the provided numbers.
| ## [](#examples)Examples ```fql Math.max(10, 33) ``` ``` 33 ``` # `Math.mean()` Get the arithmetic mean of an [Array](../../../fql/types/#array) or [Set](../../../fql/types/#set) of [Numbers](../../../fql/types/#number). ## [](#signatures)Signatures ```fql-sig Math.mean(numbers: Array) => Number Math.mean(numbers: Set) => Number ``` ## [](#description)Description Gets the arithmetic mean of an [Array](../../../fql/types/#array) or [Set](../../../fql/types/#set) of [Numbers](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | numbers | Array | Set | true | Numbers to average. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Arithmetic mean of the numbers. | ## [](#examples)Examples ### [](#average-an-array-of-numbers)Average an Array of numbers ```fql Math.mean([1, 2, 3, 4, 4]) ``` ``` 2.8 ``` ### [](#average-a-set-of-numbers)Average a Set of numbers ```fql // Converts an array to a Set. let set = [1, 2, 3, 4, 4.0].toSet() Math.mean(set) ``` ``` 2.8 ``` # `Math.min()` Get the smaller of the input parameter [Number](../../../fql/types/#number)s. ## [](#signature)Signature ```fql-sig Math.min(x: Number, y: Number) => Number ``` ## [](#description)Description Compare the input parameter [Number](../../../fql/types/#number)s and return the smaller [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | | y | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Smaller of the provided numbers. | ## [](#examples)Examples ```fql let a = 20; let b = 5; Math.min(a, b) ``` ``` 5 ``` # `Math.pow()` Get the value of a base raised to a power. ## [](#signature)Signature ```fql-sig Math.pow(x: Number, power: Number) => Number ``` ## [](#description)Description Gets the value of a base raised to a power. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Base number. | | power | Number | true | Exponent number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Number representing the provided base number taken to the power of the specified exponent. | ## [](#examples)Examples ```fql Math.pow(2, 8) ``` ``` 256.0 ``` # `Math.radians()` Convert the value of a [Number](../../../fql/types/#number) in degrees to radians. ## [](#signature)Signature ```fql-sig Math.radians(x: Number) => Number ``` ## [](#description)Description Converts the value of a [Number](../../../fql/types/#number) from degrees to radians. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number of degrees. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of the provided degrees converted to radians. | ## [](#examples)Examples ```fql Math.radians(1) ``` ``` 0.017453292519943295 ``` # `Math.round()` Get the value of a [Number](../../../fql/types/#number) rounded to the nearest integer. ## [](#signature)Signature ```fql-sig Math.round(x: Number, precision: Number) => Number ``` ## [](#description)Description Gets the value of a [Number](../../../fql/types/#number) rounded to a specified precision.
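As described under the `precision` parameter below, a negative precision rounds to the left of the decimal point. For example, a brief illustrative sketch (the exact formatting of the returned Number may vary):

```fql
// Round 1234.5 to the nearest hundred by passing
// a negative precision of -2.
// Expected to round to 1200 (displayed representation may vary).
Math.round(1234.5, -2)
```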
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number to round. | | precision | Number | true | Number of decimal places to round to. If precision is negative, the method rounds to the left of the decimal point by the specified number of places. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Value of x rounded to the specified precision. | ## [](#examples)Examples ```fql Math.round(19.51555, 2) ``` ``` 19.52 ``` # `Math.sign()` Get the sign of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.sign(x: Number) => Number ``` ## [](#description)Description Gets the sign of a provided [Number](../../../fql/types/#number). If the [Number](../../../fql/types/#number) is positive, returns `1`. If the [Number](../../../fql/types/#number) is negative, returns `-1`. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | A number representing the sign of the provided number:1 = The number is positive.-1 = The number is negative. | ## [](#examples)Examples ```fql Math.sign(-3) ``` ``` -1 ``` # `Math.sin()` Get the sine of a [Number](../../../fql/types/#number) in radians. ## [](#signature)Signature ```fql-sig Math.sin(x: Number) => Number ``` ## [](#description)Description Gets the sine of a [Number](../../../fql/types/#number) in radians. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number representing an angle in radians. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | The sine of the provided number. | ## [](#examples)Examples ```fql Math.sin(1) ``` ``` 0.8414709848078965 ``` # `Math.sinh()` Get the hyperbolic sine of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.sinh(x: Number) => Number ``` ## [](#description)Description Gets the hyperbolic sine of a [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Hyperbolic sine of the provided number. | ## [](#examples)Examples ```fql Math.sinh(1) ``` ``` 1.1752011936438014 ``` # `Math.sqrt()` Get the square root of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.sqrt(x: Number) => Number ``` ## [](#description)Description Gets the square root of a [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number greater than or equal to zero. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Square root of the provided number. | ## [](#examples)Examples ```fql Math.sqrt(2) ``` ``` 1.4142135623730951 ``` # `Math.sum()` Get the sum of an [Array](../../../fql/types/#array) or [Set](../../../fql/types/#set) of [Number](../../../fql/types/#number)s. 
## [](#signatures)Signatures ```fql-sig Math.sum(numbers: Array) => Number Math.sum(numbers: Set) => Number ``` ## [](#description)Description Get the sum of an [Array](../../../fql/types/#array) or [Set](../../../fql/types/#set) of [Number](../../../fql/types/#number)s. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | numbers | Array or Set of Numbers | true | Numbers to sum. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Sum of the provided numbers. | ## [](#examples)Examples ### [](#sum-an-array-of-numbers)Sum an Array of numbers ```fql Math.sum([1, 2, 3]) ``` ``` 6 ``` ### [](#sum-a-set-of-numbers)Sum a Set of numbers ```fql // `toSet()` converts the Array of numbers to a Set. let set = [1, 2, 3].toSet() Math.sum(set) ``` ``` 6 ``` # `Math.tan()` Get the tangent of a [Number](../../../fql/types/#number) in radians. ## [](#signature)Signature ```fql-sig Math.tan(x: Number) => Number ``` ## [](#description)Description Gets the tangent of a [Number](../../../fql/types/#number) in radians. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | Number representing an angle in radians. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Tangent of the provided number. | ## [](#examples)Examples ```fql Math.tan(1) ``` ``` 1.5574077246549023 ``` # `Math.tanh()` Get the hyperbolic tangent of a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig Math.tanh(x: Number) => Number ``` ## [](#description)Description Gets the hyperbolic tangent of a provided [Number](../../../fql/types/#number). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Hyperbolic tangent of the provided number. | ## [](#examples)Examples ```fql Math.tanh(1) ``` ``` 0.7615941559557649 ``` # `Math.trunc()` Truncate a [Number](../../../fql/types/#number) to a given precision. ## [](#signature)Signature ```fql-sig Math.trunc(x: Number, precision: Number) => Number ``` ## [](#description)Description Returns a provided [Number](../../../fql/types/#number), truncated to a specified precision, depending on the underlying representation of the floating point number. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | x | Number | true | A number. | | precision | Number | true | Precision to truncate to. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | The provided number, truncated to the specified precision. | ## [](#examples)Examples ```fql Math.trunc(12.123, 1) ``` ``` 12.1 ``` Note that the result might reflect the underlying imprecision of the representation of a floating point number: ```fql Math.trunc(12.123, 3) ``` ``` 12.122 ``` # Object FQL [Objects](../../fql/types/#object) are collections of key-value pairs, similar to JSON Objects. The FQL API’s Object module contains methods that let you manage and manipulate Objects. ## [](#static-methods)Static methods You can use the following static methods to manage and manipulate Objects in FQL. | Method | Description | | --- | --- | --- | --- | | Object.assign() | Copies properties from a source Object to a destination Object. 
| | Object.entries() | Convert an Object to an Array of key-value pairs. | | Object.fromEntries() | Convert an Array of key-value pairs to an Object. | | Object.hasPath() | Test if an Object has a property. | | Object.keys() | Get an Object's top-level property keys as an Array. | | Object.select() | Get an Object property’s value by its path. | | Object.toString() | Convert an Object to a String. | | Object.values() | Get an Object's property values as an Array. | # `Object.assign()` Copies properties from a source [Object](../../../fql/types/#object) to a destination [Object](../../../fql/types/#object). ## [](#signature)Signature ```fql-sig Object.assign(destination: { *: A }, source: { *: B }) => { *: A | B } ``` ## [](#description)Description `Object.assign()` copies properties from a source Object to a destination Object. If the source and destination Object have a property with the same key, the value from the source Object is used. If a property in the source Object doesn’t exist in the destination Object, the property is added to the destination Object. `Object.assign()` doesn’t remove `null` properties. ### [](#convert-documents-to-objects)Convert documents to Objects You can use `Object.assign()` to convert a [document](../../../../learn/data-model/documents/) into an Object. This can be useful for transforming results and manipulating document data without mutating the underlying document. See [Convert a document to an Object](#doc-to-obj). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | destination | Object containing fields of Any type. | true | The destination Object that the source’s properties are copied to. | | source | Object containing fields of Any type. | true | Source Object containing properties to copy to the destination Object. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Any | Updated destination Object. | ## [](#examples)Examples ### [](#basic-example)Basic example The following example copies properties from a source Object to an empty destination Object: ```fql Object.assign({ }, { a: 0, b: 'x' }) ``` ``` { a: 0, b: "x" } ``` ### [](#merge-properties)Merge properties You can use `Object.assign()` to merge the properties of the source and destination Object: ```fql Object.assign({ a: 0 }, { b: 'x' }) ``` ``` { a: 0, b: "x" } ``` ### [](#doc-to-obj)Convert a document to an Object You can use `Object.assign()` to convert a [document type](../../../../learn/data-model/documents/#document-type) into an [Object](../../../fql/types/#object): ```fql // Get a Category collection document. let category: Any = Category.byName("frozen").first() // Convert the Category document to an Object. Object.assign({ }, category) ``` The query returns an Object, not a document type: ``` { id: "456", coll: Category, ts: Time("2099-10-02T22:37:39.583Z"), products: { data: [ { id: "333", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") } ] }, name: "frozen", description: "Frozen Foods" } ``` # `Object.entries()` Convert an [Object](../../../fql/types/#object) to an [Array](../../../fql/types/#array) of key-value pairs. ## [](#signature)Signature ```fql-sig Object.entries(object: { *: A }) => Array<[String, A]> ``` ## [](#description)Description `Object.entries()` gets an Object’s properties as Array elements.
You can then iterate through the Array using one of the following methods: * [`array.forEach()`](../../array/foreach/) * [`array.map()`](../../array/map/) * [`array.flatMap()`](../../array/flatmap/) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | object | Object containing fields of Any type. | true | Object to convert to an Array. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array of the Object’s key-value pairs. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Object containing the name and stock quantity // of various products. let products = { bananas: 300, limes: 200, lemons: 500 } // Convert the Object to an Array. Object.entries(products) ``` ``` [ [ "bananas", 300 ], [ "limes", 200 ], [ "lemons", 500 ] ] ``` ### [](#iterate-through-an-object)Iterate through an Object You can use the following methods to iterate through the Array returned by `Object.entries()`: * [`array.forEach()`](../../array/foreach/) * [`array.map()`](../../array/map/) * [`array.flatMap()`](../../array/flatmap/) Extending the previous example: ```fql // Object containing the name and stock quantity // of various products. let products = { bananas: 300, limes: 200, lemons: 500 } // Convert the Object to an Array. let prodArray = Object.entries(products) // Iterate through the Array. prodArray.map(product => { // Concatenate each product's name and stock. product[0] + ": " + product[1].toString() }) ``` ``` [ "bananas: 300", "limes: 200", "lemons: 500" ] ``` ### [](#obj-transform)Transform Objects You can use [`Object.entries()`](./), [`Object.fromEntries()`](../fromentries/), and [Array methods](../../array/) to transform objects: ```fql // This query multiplies each Number in an Object // by 2. // An object containing various Numbers. let nums = { a: 1, b: 2, c: 3 } // Convert the Object to an Array. let numArray = Object.entries(nums) // Iterate through the Array, multiplying // each value and outputting an Object. Object.fromEntries( numArray.map(elem => { let key = elem[0] let value = elem[1] [ key, value * 2 ] }) ); ``` ``` { a: 2, b: 4, c: 6 } ``` # `Object.fromEntries()` Convert an [Array](../../../fql/types/#array) of key-value pairs to an [Object](../../../fql/types/#object). ## [](#signature)Signature ```fql-sig Object.fromEntries(entries: Array<[String, A]>) => { *: A } ``` ## [](#description)Description `Object.fromEntries()` creates an Object from an Array of key-value pairs. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | entries | Array | true | An Array containing nested Arrays of key-value pairs. Each nested Array represents an Object property and should have two elements:0: A String representing the property key.1: The property value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Object | Object with properties from the entries Array. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Array containing the name and stock quantity // of various products. let products = [ [ "bananas", 300 ], [ "limes", 200 ], [ "lemons", 500 ] ] // Convert the Array to an Object. Object.fromEntries(products) ``` ``` { bananas: 300, limes: 200, lemons: 500 } ``` ### [](#dynamically-assign-object-keys)Dynamically assign object keys You can use [`Object.fromEntries()`](./) to dynamically assign object property keys.
For example: ```fql let property1 = "hello" let property2 = "hi" let entries = [ [property1, "world"], [property2, "there"] ]; Object.fromEntries(entries); ``` ``` { hello: "world", hi: "there" } ``` ### [](#obj-transform)Transform Objects You can use [`Object.entries()`](../entries/), [`Object.fromEntries()`](./), and [Array methods](../../array/) to transform objects: ```fql // This query multiplies each Number in an Object // by 2. // An object containing various Numbers. let nums = { a: 1, b: 2, c: 3 } // Convert the Object to an Array. let numArray = Object.entries(nums) // Iterate through the Array, multiplying // each value and outputting an Object. Object.fromEntries( numArray.map(elem => { let key = elem[0] let value = elem[1] [ key, value * 2 ] }) ); ``` ``` { a: 2, b: 4, c: 6 } ``` # `Object.hasPath()` Test if an [Object](../../../fql/types/#object) has a property. ## [](#signature)Signature ```fql-sig Object.hasPath(object: { *: Any }, path: Array) => Boolean ``` ## [](#description)Description `Object.hasPath()` tests if an Object contains a property based on a provided path. The path is defined as an [Array](../../../fql/types/#array) of [Strings](../../../fql/types/#string), where each String represents a property key in the Object’s hierarchy. The method resolves the Object structure following the path to search for the property. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | object | Object containing fields of Any type. | true | Object to test for a property. | | path | Array of Strings | true | Path to an Object property. Each element in the Array represents a level in the Object’s hierarchy. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Object contains the property. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Test if the Object contains the top-level `foo` property. Object.hasPath({ foo : 'bar' }, ['foo']) ``` ``` true ``` ### [](#property-that-doesnt-exist)Property that doesn’t exist ```fql // Test if the Object contains the top-level `baz` property. Object.hasPath({ foo : 'bar' }, ['baz']) ``` ``` false ``` ### [](#nested-property)Nested property ```fql // Defines an Object with customer data. let customer = { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } // Test if the customer Object contains the // nested `address.state` property. Object.hasPath(customer, ['address', 'state']) ``` ``` true ``` ### [](#nested-property-that-doesnt-exist)Nested property that doesn’t exist ```fql // Defines an Object with customer data. let customer = { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } // Test if the customer Object contains the // nested `address.zipCode` property. Object.hasPath(customer, ['address', 'zipCode']) ``` ``` false ``` ### [](#format-results-for-dynamic-projection)Format results for dynamic projection The following `getFormatter()` [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) uses `Object.hasPath()` to determine whether a predefined `formatterMap` object contains a property with a matching collection name. If not, it returns an [abort error](../../globals/abort/#error).
The `getList()` UDF calls `getFormatter()` to return the projection for the provided collection. The UDFs are used for [dynamic projection](../../../fql/projection/#udf). ```fsl // Defines the `getFormatter()` UDF. // Accepts a collection name and returns a // format for the collection. function getFormatter(collName) { // Defines an object with a format // for each accepted collection name. let formatterMap = { Product: product => product { name, description, price }, Category: category => category { name, description } } // Use abort() to return an error if the // collection name doesn't have a format. if (!Object.hasPath(formatterMap, [collName])) { abort("No formatter named '#{collName}'") } formatterMap[collName] } // Returns the collection's documents, formatted // with the predefined formatter. function getList(collName) { let collection = Collection(collName) // Calls the previous `getFormatter()` UDF. let formatFn = getFormatter(collName) collection.all().map(formatFn) } ``` # `Object.keys()` Get an [Object](../../../fql/types/#object)'s top-level property keys as an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig Object.keys(object: Any) => Array ``` ## [](#description)Description `Object.keys()` returns an Array with the top-level property keys of a provided Object. The Array does not contain keys for nested properties. `Object.keys()` does not change the original Object. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | object | Object | true | Object to extract keys from. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array of Strings | An array containing the keys of the Object as Strings. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql Object.keys({ bananas: 300, limes: 200, lemons: 500 }) ``` ``` [ "bananas", "limes", "lemons" ] ``` ### [](#nested-object-properties)Nested Object properties `Object.keys()` only extracts an Object’s top-level property keys. It doesn’t extract nested property keys. ```fql Object.keys({ bananas: 300, limes: 200, lemons: { organic: 500 } }) ``` ``` // The results don't include the nested `lemons.organic` // property key. [ "bananas", "limes", "lemons" ] ``` # `Object.select()` Get an [Object](../../../fql/types/#object) property’s value by its path. ## [](#signature)Signature ```fql-sig Object.select(object: { *: Any }, path: Array) => Any ``` ## [](#description)Description `Object.select()` gets an [Object](../../../fql/types/#object) property’s value by its path. The path is defined as an [Array](../../../fql/types/#array) of [Strings](../../../fql/types/#string), where each String represents a property key in the Object’s hierarchy. The method resolves the Object structure following the path to retrieve the desired value. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | object | Object containing fields of Any type | true | Object to extract the property from. | | path | Array of Strings | true | Path to a property in the Object. Each element in the Array represents a level in the Object’s hierarchy. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Any | Value of the property at the path. If the path doesn’t exist or is undefined, the value is null. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Gets the value of the top-level `foo` property.
Object.select({ foo : 'bar' }, ['foo']) ``` ``` "bar" ``` ### [](#nested-property)Nested property ```fql // Defines an Object with customer data. let customer = { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } // Gets the customer's nested `address.state` property. Object.select(customer, ['address', 'state']) ``` ``` "DC" ``` ### [](#retrieve-a-nested-object)Retrieve a nested Object ```fql // Defines an Object with customer data. let customer = { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "street": "87856 Mendota Court", "city": "Washington", "state": "DC", "postalCode": "20220", "country": "US" } } // Gets the customer's nested `address` property, // which is an Object. Object.select(customer, ['address']) ``` ``` { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } ``` # `Object.toString()` Convert an [Object](../../../fql/types/#object) to a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Object.toString(object: Any) => String ``` ## [](#description)Description `Object.toString()` creates a String representation of a provided Object. This method is useful for debugging, logging, or when you need a String representation of complex Objects. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | object | Any | true | Object to convert to a String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | String representation of the Object. Special characters are properly escaped. | ## [](#examples)Examples ```fql Object.toString({a: "test"}) ``` ``` "{ a: \"test\" }" ``` ### [](#nested-object)Nested Object ```fql // Defines an Object with customer data. let customer = { "name": "Ruby Von Rails", "email": "ruby@example.com", "address": { "city": "Washington", "state": "DC" } } Object.toString(customer) ``` ``` "{ name: \"Ruby Von Rails\", email: \"ruby@example.com\", address: { city: \"Washington\", state: \"DC\" } }" ``` ### [](#object-with-multiple-data-types)Object with Multiple Data Types ```fql // Defines an Object. let data = { string: "Hello", number: 42, boolean: true, null: null, array: [1, 2, 3] } Object.toString(data) ``` ``` "{ string: \"Hello\", number: 42, boolean: true, null: null, array: [1, 2, 3] }" ``` ### [](#objects-containing-document-references)Objects containing document references ```fql // Defines an Object. let data = { // Contains a reference to a Category // collection document. category: Category.byName("produce").first(), } Object.toString(data) ``` ``` "{ category: { id: ID(\"789\"), coll: Category, ts: Time(\"2099-10-02T22:37:39.583Z\"), products: [set], name: \"produce\", description: \"Fresh Produce\" } }" ``` ### [](#objects-containing-set-references)Objects containing Set references ```fql // Defines an Object. let data = { // Contains a reference to a Set of all // Category collection documents. category: Category.all(), } Object.toString(data) ``` ``` "{ category: [set] }" ``` # `Object.values()` Get an [Object](../../../fql/types/#object)'s property values as an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig Object.values(object: { *: A }) => Array ``` ## [](#description)Description `Object.values()` returns an [Array](../../../fql/types/#array) containing an Object’s property values. 
`Object.values()` does not change the original Object. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | object | Object containing fields of Any type. | true | Object to get property values from. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array of values extracted from the Object. | ## [](#examples)Examples ## [](#basic-example)Basic example ```fql Object.values({ a: 0, b: 1 }) ``` ``` [ 0, 1 ] ``` ## [](#objects-with-non-scalar-values)Objects with non-scalar values Any nested Arrays in the original Object are left intact, resulting in a multi-dimensional Array. For example: ```fql Object.values({ a: [1, 2], b: 'foo' }) ``` ``` [ [ 1, 2 ], "foo" ] ``` Similarly, any nested Objects in the original Object are left intact: ```fql Object.values({ a: { x: 1, y: 2 }, b: 'foo' }) ``` ``` [ { x: 1, y: 2 }, "foo" ] ``` # Query ## [](#description)Description The `Query` namespace has security functions that return access information about a caller. ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Query.identity() | Get the identity document for the query’s authentication token. | | Query.isEnvProtected() | Test if the queried database is in protected mode. | | Query.isEnvTypechecked() | Test if the queried database is typechecked. | | Query.token() | Get the Token document or JWT payload for the query’s authentication secret. | # `Query.identity()` | Learn: Attribute-based access control (ABAC) | | --- | --- | --- | Get the [identity document](../../../../learn/security/tokens/#identity-document) for the query’s [authentication token](../../../../learn/security/tokens/). ## [](#signature)Signature ```fql-sig Query.identity() => { *: Any } | Null ``` ## [](#description)Description Gets the [identity document](../../../../learn/security/tokens/#identity-document) for the query’s [authentication token](../../../../learn/security/tokens/). You can call `identity()` in role-related predicates used for [attribute-based access control (ABAC)](../../../../learn/security/abac/). If the query is authenticated using a [JWT](../../../../learn/security/access-providers/) or [key](../../../../learn/security/keys/), the method returns `null`. JWTs and keys aren’t tied to an identity document. ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Object | identity document for the authentication token. | | Null | No identity document is associated with the authentication secret. | ## [](#examples)Examples ```fql Query.identity() ``` ``` { id: "111", coll: Customer, ts: Time("2099-06-21T18:39:00.735Z"), cart: Order("413090255209497088"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` # `Query.isEnvProtected()` Test if the queried database is in [protected mode](../../../../learn/schema/#protected-mode). ## [](#signature)Signature ```fql-sig Query.isEnvProtected() => Boolean ``` ## [](#description)Description Tests if the queried database is in [protected mode](../../../../learn/schema/#protected-mode), which prohibits destructive operations on a database’s collections. The method checks the database to which the query’s [authentication secret](../../../../learn/security/authentication/#secrets) is scoped. 
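For example, the following sketch (using a hypothetical `Album` collection) checks protected mode before attempting a destructive operation:

```fql
// If the database is in protected mode, skip the
// destructive operation. Otherwise, delete the
// hypothetical `Album` collection.
if (Query.isEnvProtected()) {
  "Protected mode is enabled. Skipping deletion."
} else {
  Collection.byName("Album")?.delete()
}
```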
## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the database is in protected mode. Otherwise, false. | ## [](#examples)Examples Call `isEnvProtected()` with the built-in `admin` role on an unprotected database: ```fql Query.isEnvProtected() ``` ``` false ``` Calling `isEnvProtected()` with the built-in `server` role fails: ```fql Query.isEnvProtected() ``` ``` permission_denied: Insufficient privileges to perform the action. error: Insufficient privileges to perform the action. at *query*:1:21 | 1 | Query.isEnvProtected() | ^^ | ``` # `Query.isEnvTypechecked()` Test if the queried database is [typechecked](../../../../learn/query/static-typing/). ## [](#signature)Signature ```fql-sig Query.isEnvTypechecked() => Boolean ``` ## [](#description)Description Tests if the queried database is [typechecked](../../../../learn/query/static-typing/). The method checks the database to which the query’s [authentication secret](../../../../learn/security/authentication/#secrets) is scoped. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the database is typechecked. Otherwise, false. | ## [](#examples)Examples ```fql Query.isEnvTypechecked() ``` ``` true ``` # `Query.token()` Get the [`Token` document](../../../../learn/security/tokens/) or [JWT payload](../../../../learn/security/access-providers/) for the query’s [authentication secret](../../../../learn/security/authentication/#secrets). ## [](#signature)Signature ```fql-sig Query.token() => { *: Any } | Null ``` ## [](#description)Description Gets the [`Token` document](../../../../learn/security/tokens/) or [JWT payload](../../../../learn/security/access-providers/) for the query’s [authentication secret](../../../../learn/security/authentication/#secrets). If the secret is a [token](../../../../learn/security/tokens/), the method returns the token’s [`Token` system collection document](../../token/). This token document is distinct from the token’s [identity document](../../../../learn/security/tokens/#identity-document). If the secret is a [JWT](../../../../learn/security/access-providers/), the method returns the JWT’s payload. If the secret is a [key](../../../../learn/security/keys/), the method returns `null`. ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Object | Token document or JWT payload. | | Null | The authentication secret is a key. | ## [](#examples)Examples ```fql Query.token() ``` ``` { id: "412664453937496576", coll: Token, ts: Time("2099-07-06T19:11:13.570Z"), document: Customer("111") } ``` # Role | Learn: Roles | | --- | --- | --- | Fauna uses [secrets](../../../learn/security/authentication/#secrets) for authentication and authorization. Roles determine a secret’s privileges, which control data access. ## [](#collection)`Role` collection Fauna stores user-defined roles as documents in the `Role` system collection. These documents are an FQL version of the FSL [role schema](../../fsl/role/).
`Role` documents have the following FQL structure: ``` { name: "customer", coll: Role, ts: Time("2099-07-31T12:37:05.280Z"), privileges: [ { resource: "Product", actions: { read: true } }, { resource: "Order", actions: { read: "(ref) => Query.identity() == ref.customer" } }, { resource: "Customer", actions: { read: "(ref) => Query.identity() == ref" } }, { resource: "getOrCreateCart", actions: { call: "(id) => Query.identity()?.id == id" } }, { resource: "checkout", actions: { call: "(name) => true" } } ], membership: [ { resource: "Customer" }, { resource: "Manager", predicate: "(user) => user.accessLevel == \"manager\"" } ], data: { desc: "End user customer role" } } ``` | Field name | Value type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | name | String | | true | Unique name for the role in the database.Must begin with a letter. Can only include letters, numbers, and underscores. admin, server, and server-readonly are reserved and can’t be used. | | coll | Collection | true | | Collection name: Role. | | ts | Time | true | | Last time the document was created or updated. | | privileges | Any | Null | | | Array of privilege objects. Each object allows one or more actions on a resource. See Privilege definition. | | membership | Any | Null | | | Array of membership objects. Each object assigns the role to tokens based on the token’s identity document. See Membership definition. | | data | { *: Any } | Null | | | Arbitrary user-defined metadata for the document. | ### [](#privilege-obj)Privilege definition The `privileges` field accepts an array of privilege objects. Privilege objects have the following schema: | Field name | Value type | Description | | --- | --- | --- | --- | --- | | resource | String | Name of a collection or user-defined function (UDF). Supports user-defined collections and the following system collections: AccessProvider, Collection, Credential, Database, Function, Key, Role, Token.The read privilege grants the ability to call the collection’s indexes. | | actions | Object | Types of operations allowed on the resource.Each key in the object is an action. Privileges support different actions based on the resource type. See Privilege actions.Each key’s value is true, indicating the role is assigned the privilege unconditionally, or a predicate used to conditionally grant the privilege.Privilege predicates are passed different arguments based on their action. See Privilege predicate arguments. | #### [](#privilege-actions)Privilege actions Privileges support different actions based on their resource type. | Resource type | Action | Allows you to …​ | | --- | --- | --- | --- | --- | | Collection | create | Create documents in the collection. To create documents with a custom id, you must also have the create_with_id privilege. | | Collection | delete | Delete documents in the collection. | | Collection | read | Read documents in the collection. Can also call the collection’s indexes. To read historical document snapshots, you must also have the history_read privilege. | | Collection | write | Update or replace documents in the collection. | | Collection | create_with_id | Create documents with a custom id in the collection. You must also have the create privilege. | | Collection | history_read | Read snapshots for documents in the collection. See Run a temporal query. | | User-defined function (UDF) | call | Call the function.
| #### [](#privilege-predicate-arguments)Privilege predicate arguments Privilege predicates are passed different arguments based on their action. | Action | Predicate function signature | | --- | --- | --- | --- | | create | (doc: Object) => Boolean | Null doc: Object containing the document to create. Includes metadata fields. | | delete | (doc: Object) => Boolean | Null doc: Object containing the document to delete. Includes metadata fields. | | read | (doc: Object) => Boolean | Null doc: Object containing the document to read. Includes metadata fields. | | write | (oldDoc: Object, newDoc: Object) => Boolean | Null oldDoc: Object containing the original document. Includes metadata fields.newDoc: Object containing the document to write. Includes metadata fields. | | create_with_id | (doc: Object) => Boolean | Null doc: Object containing the document to create. Includes metadata fields. | | history_read | (doc: Object) => Boolean | Null doc: Object containing the document to read. Includes metadata fields. | | call | (args: Array) => Boolean | Null args: Array containing the function call’s arguments. | ### [](#membership-obj)Membership definition The `membership` field accepts an array of membership objects. Membership objects have the following schema: | Field name | Value type | Description | | --- | --- | --- | --- | --- | | resource | String | Name of a user-defined collection.Fauna assigns the role to tokens with an identity document in the collection. | | predicate | String | Predicate used to conditionally assign the role. If the predicate is not true, the role is not assigned. If the predicate is omitted, the role is assigned unconditionally.The predicate is passed one argument: an object containing the token’s identity document.The predicate runs with the built-in server role’s privileges. Supports shorthand syntax. | ## [](#static-methods)Static methods You can use the following static methods to manage the `Role` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Role.all() | Get a Set of all user-defined roles. | | Role.byName() | Get a user-defined role by its name. | | Role.create() | Create a user-defined role. | | Role.firstWhere() | Get the first user-defined role matching a provided predicate. | | Role.toString() | Get "Role" as a String. | | Role.where() | Get a Set of user-defined roles that match a provided predicate. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage specific `Role` documents in FQL. | Method | Description | | --- | --- | --- | --- | | role.delete() | Delete a user-defined role. | | role.exists() | Test if a user-defined role exists. | | role.replace() | Replace a user-defined role. | | role.update() | Update a user-defined role. | # `Role.all()` | Learn: Roles | | --- | --- | --- | Get a Set of all [user-defined roles](../../../../learn/security/roles/). ## [](#signature)Signature ```fql-sig Role.all() => Set Role.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [user-defined roles](../../../../learn/security/roles/), represented as [`Role` documents](../), for the database. To limit the returned Set, you can provide an optional range. `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). If this method is the last expression in a query, the first page of the Set is returned.
See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Role documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded Role.all() call. See Range examples.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all roles are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be a Role document. | | to | Any | | End of the range (inclusive). Must be a Role document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Role documents in the provided range. If a range is omitted, all roles are returned.The Set is empty if:The database has no roles.There are no roles in the provided range.The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples 1. Get all roles for the database: ```fql Role.all() ``` ``` { data: [ { name: "manager", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] }, { name: "customer", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] } ] } ``` 2. Given the previous Set, get all roles starting with `manager` (inclusive): ```fql Role.all({ from: Role.byName("manager") }) ``` ``` { data: [ { name: "manager", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] }, { name: "customer", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] } ] } ``` 3. Get a Set of roles from `manager` (inclusive) to `customer` (inclusive): ```fql Role.all({ from: Role.byName("manager"), to: Role.byName("customer") }) ``` ``` { data: [ { name: "manager", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] }, { name: "customer", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] } ] } ``` 4. Get a Set of roles up to `customer` (inclusive): ```fql Role.all({ to: Role.byName("customer") }) ``` ``` { data: [ { name: "manager", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] }, { name: "customer", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ ... ], membership: [ ... ] } ] } ``` # `Role.byName()` | Learn: Roles | | --- | --- | --- | Get a [user-defined role](../../../../learn/security/roles/) by its name. ## [](#signature)Signature ```fql-sig Role.byName(name: String) => NamedRef ``` ## [](#description)Description Gets a [user-defined role](../../../../learn/security/roles/), represented as a [`Role` document](../), by its name. `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/).
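The returned reference can be chained with `Role` instance methods. For example, a minimal sketch, assuming a user-defined `customer` role exists:

```fql
// Look up the role by name and update its
// user-defined `data` metadata.
Role.byName("customer")!.update({
  data: { desc: "Role for end-user customers" }
})
```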
### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | name | String | true | name of the Role document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NamedRef | Resolved reference to the Role document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Role.byName("manager") ``` ``` { name: "manager", coll: Role, ts: Time("2099-10-28T16:01:40.805Z"), privileges: [ { resource: "OrderItem", actions: { create: true, read: true, write: true, delete: true } }, { resource: "Customer", actions: { read: true } }, { resource: "Manager", actions: { read: "(doc) => Query.identity() == doc && Date.today().dayOfWeek < 6" } }, { resource: "getOrCreateCart", actions: { call: true } }, { resource: "checkout", actions: { call: <<-END (args) => { let order = Order.byId(args[0])! order?.customer == Query.identity() } END } } ], membership: [ { resource: "Manager" }, { resource: "User", predicate: "(user) => user.accessLevel == \"manager\"" } ] } ``` # `Role.create()` | Learn: Roles | | --- | --- | --- | Create a [user-defined role](../../../../learn/security/roles/). ## [](#signature)Signature ```fql-sig Role.create(data: { name: String, privileges: Any | Null, membership: Any | Null, data: { *: Any } | Null }) => Role ``` ## [](#description)Description Creates a [user-defined role](../../../../learn/security/roles/) with the provided document fields. Fauna stores user-defined roles as documents in the [`Role` system collection](../). `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method adds a role to the staged schema, not the active schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the new Role document.For supported document fields, see Role collection. 
| ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Role | The new Role document. | ## [](#role-create-example)Examples ```fql Role.create({ name: "customer", privileges: [ { resource: "Product", actions: { read: true } }, { resource: "Order", actions: { read: "(ref) => Query.identity() == ref.customer" } }, { resource: "Customer", actions: { read: "(ref) => Query.identity() == ref" } }, { resource: "getOrCreateCart", actions: { call: "(id) => Query.identity()?.id == id" } }, { resource: "checkout", actions: { call: "(name) => true" } } ], membership: [ { resource: "Customer" } ], data: { desc: "End user customer role" } }) ``` ``` { name: "customer", coll: Role, ts: Time("2099-07-31T12:37:05.280Z"), privileges: [ { resource: "Product", actions: { read: true } }, { resource: "Order", actions: { read: "(ref) => Query.identity() == ref.customer" } }, { resource: "Customer", actions: { read: "(ref) => Query.identity() == ref" } }, { resource: "getOrCreateCart", actions: { call: "(id) => Query.identity()?.id == id" } }, { resource: "checkout", actions: { call: "(name) => true" } } ], membership: [ { resource: "Customer" } ], data: { desc: "End user customer role" } } ``` # `Role.firstWhere()` | Learn: Roles | | --- | --- | --- | Get the first [user-defined role](../../../../learn/security/roles/) matching a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Role.firstWhere(pred: (Role => Boolean)) => Role | Null ``` ## [](#description)Description Gets the first [user-defined role](../../../../learn/security/roles/), represented as a [`Role` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Role document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns the first Role document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Role | First Role document that matches the predicate. | | Null | No Role document matches the predicate. | ## [](#examples)Examples ```fql Role.firstWhere(.name.includes("manager")) ``` ``` { name: "manager", coll: Role, ts: Time("2099-10-28T16:01:40.805Z"), privileges: [ { resource: "OrderItem", actions: { create: true, read: true, write: true, delete: true } }, { resource: "Customer", actions: { read: true } }, { resource: "Manager", actions: { read: "(doc) => Query.identity() == doc && Date.today().dayOfWeek < 6" } }, { resource: "getOrCreateCart", actions: { call: true } }, { resource: "checkout", actions: { call: <<-END (args) => { let order = Order.byId(args[0])! order?.customer == Query.identity() } END } } ], membership: [ { resource: "Manager" }, { resource: "User", predicate: "(user) => user.accessLevel == \"manager\"" } ] } ``` # `Role.toString()` | Learn: Roles | | --- | --- | --- | Get `"Role"` as a [String](../../../fql/types/#string). 
## [](#signature)Signature ```fql-sig Role.toString() => String ``` ## [](#description)Description Returns the name of the [`Role` collection](../) as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Role" | ## [](#examples)Examples ```fql Role.toString() ``` ``` "Role" ``` # `Role.where()` | Learn: Roles | | --- | --- | --- | Get a Set of [user-defined roles](../../../../learn/security/roles/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Role.where(pred: (Role => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [user-defined roles](../../../../learn/security/roles/), represented as [`Role` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). If `Role.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts an Role document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of Role documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Role documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql Role.where(.name.includes("manager")) ``` ``` { data: [ { name: "manager", coll: Role, ts: Time("2099-10-28T16:01:40.805Z"), privileges: [ { resource: "OrderItem", actions: { create: true, read: true, write: true, delete: true } }, { resource: "Customer", actions: { read: true } }, { resource: "Manager", actions: { read: "(doc) => Query.identity() == doc && Date.today().dayOfWeek < 6" } }, { resource: "getOrCreateCart", actions: { call: true } }, { resource: "checkout", actions: { call: <<-END (args) => { let order = Order.byId(args[0])! order?.customer == Query.identity() } END } } ], membership: [ { resource: "Manager" }, { resource: "User", predicate: "(user) => user.accessLevel == \"manager\"" } ] } ] } ``` # `role.delete()` | Learn: Roles | | --- | --- | --- | Delete a [user-defined role](../../../../learn/security/roles/). ## [](#signature)Signature ```fql-sig delete() => NullRole ``` ## [](#description)Description Deletes a [user-defined role](../../../../learn/security/roles/), represented as a [`Role` document](../). `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). ### [](#staged-schema)Staged schema You can’t delete a role while a database has [staged schema](../../../../learn/schema/manage-schema/#staged). If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. 
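Because an unstaged delete takes effect immediately, you may want to confirm the role exists before deleting it. A minimal sketch, reusing the `customer` role from the examples below:

```fql
// A minimal sketch: delete the "customer" role only if it exists,
// avoiding a null error when the role is missing.
let role = Role.byName("customer")
if (role.exists()) {
  role!.delete()
} else {
  null
}
```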
#### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullRole | Document doesn’t exist. See NullDoc. | ## [](#examples)Examples ```fql Role.byName("customer")!.delete() ``` ``` Role.byName("customer") /* deleted */ ``` # `role.exists()` | Learn: Roles | | --- | --- | --- | Test if a [user-defined role](../../../../learn/security/roles/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if a [user-defined role](../../../../learn/security/roles/), represented as a [`Role` document](../), exists. `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Role.byName("manager").exists() // true Role.byName("manager") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Role document exists. If false, the Role document doesn’t exist. | ## [](#examples)Examples ```fql Role.byName("manager").exists() ``` ``` true ``` # `role.replace()` | Learn: Roles | | --- | --- | --- | Replace a [user-defined role](../../../../learn/security/roles/). ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => Role ``` ## [](#description)Description Replaces all fields in a [user-defined role](../../../../learn/security/roles/), represented as a [`Role` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `coll` and `ts` metadata fields, are removed. `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/).
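Because fields not present in the data object are removed, a replace that omits `membership` also strips the role’s membership rules. A minimal sketch, assuming the `manager` role from the examples below exists:

```fql
// A minimal sketch: `replace()` removes fields not present in the
// data object, so this call leaves `manager` with no `membership`
// rules and only a single read privilege.
Role.byName("manager")?.replace({
  name: "manager",
  privileges: [
    { resource: "Product", actions: { read: true } }
  ]
})
```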
### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename a role while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Fields for the Role document. Fields not present, excluding the coll and ts metadata fields, in the object are removed.For supported document fields, see Role collection.The object can’t include the following metadata fields:* coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Role | Role document with replaced fields. | ## [](#examples)Examples ```fql Role.byName("manager")?.replace({ name: "manager", privileges: [ { resource: "OrderItem", actions: { create: true, read: true, write: true, delete: true } }, { resource: "Customer", actions: { read: true } }, { resource: "Manager", actions: { read: "(doc) => Query.identity() == doc && Date.today().dayOfWeek < 6" } }, { resource: "getOrCreateCart", actions: { call: true } }, { resource: "checkout", actions: { call: <<-END (args) => { let order = Order.byId(args[0])! order?.customer == Query.identity() } END } } ], membership: [ { resource: "Manager" }, { resource: "User", predicate: "(user) => user.accessLevel == \"manager\"" } ] }) ``` ``` { name: "manager", coll: Role, ts: Time("2099-10-28T16:14:20.640Z"), privileges: [ { resource: "OrderItem", actions: { create: true, read: true, write: true, delete: true } }, { resource: "Customer", actions: { read: true } }, { resource: "Manager", actions: { read: "(doc) => Query.identity() == doc && Date.today().dayOfWeek < 6" } }, { resource: "getOrCreateCart", actions: { call: true } }, { resource: "checkout", actions: { call: <<-END (args) => { let order = Order.byId(args[0])! order?.customer == Query.identity() } END } } ], membership: [ { resource: "Manager" }, { resource: "User", predicate: "(user) => user.accessLevel == \"manager\"" } ] } ``` # `role.update()` | Learn: Roles | | --- | --- | --- | Update a [user-defined role](../../../../learn/security/roles/). 
## [](#signature)Signature ```fql-sig update(data: { *: Any }) => Role ``` ## [](#description)Description Updates a [user-defined role](../../../../learn/security/roles/), represented as a [`Role` document](../), with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. `Role` documents are FQL versions of a database’s FSL [role schema](../../../fsl/role/). See [Roles](../../../../learn/security/roles/). ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `coll` * `ts` ### [](#staged-schema)Staged schema If a database has [staged schema](../../../../learn/schema/manage-schema/#staged), this method interacts with the database’s staged schema, not the active schema. You can’t rename a role while a database has staged schema. If the database has no staged schema, using this method is equivalent to making an [unstaged schema change](../../../../learn/schema/manage-schema/#unstaged). Changes are applied immediately to the database’s active schema. #### [](#concurrent)Avoid concurrent schema changes Concurrent [unstaged schema changes](../../../../learn/schema/manage-schema/#unstaged) can cause [contended transactions](../../../../learn/transactions/contention/), even if the changes affect different resources. This includes unstaged changes made using: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) A schema change triggers a [transaction that validates the entire database schema](../../../../learn/schema/#validation). To avoid errors, do one of the following instead: * Run [staged schema changes](../../../../learn/schema/manage-schema/#staged) * Perform unstaged schema changes sequentially ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the Role document.For supported document fields, see Role collection.The object can’t include the following metadata fields:* coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Role | The updated Role document. | ## [](#examples)Examples ```fql Role.byName("manager")?.update({ membership: [ { resource: "Manager" } ], privileges: [ { resource: "Order", actions: { create: false, delete: false, read: true, write: false } }, { resource: "Product", actions: { create: true } } ], data: { custom: "some data" } }) ``` ``` { name: "manager", coll: Role, ts: Time("2099-07-27T22:40:32.735Z"), privileges: [ { resource: "Order", actions: { create: false, delete: false, read: true, write: false } }, { resource: "Product", actions: { create: true } } ], membership: [ { resource: "Manager" } ], data: { custom: "some data" } } ``` # Set | Learn: Sets | | --- | --- | --- | [Set](../../fql/types/#set) methods and properties.
## [](#description)Description A [Set](../../fql/types/#set) represents a group of values, for example, documents in a [Collection](../../fql/types/#collection). When a [Set](../../fql/types/#set) is returned from a query, it is materialized into a page of results that includes a subset of the [Set](../../fql/types/#set) with a pagination cursor. ## [](#eager)Lazy vs. Eager loading To minimize resource consumption, Set methods use lazy loading where possible. These methods defer the materialization of the calling Set and related computations until the data is explicitly needed. Other Set methods, such as [`set.includes()`](includes/), require eager loading. These methods materialize the entire calling Set upfront, even if not all data is returned in the results. For unindexed document Sets, this requires a read of each document in the Set. ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Set.paginate() | Get a page of paginated results using an after cursor. | | Set.sequence() | Create an ordered Set of Numbers given start and end values. | | Set.single() | Create a Set containing a single provided element. | ## [](#instance-methods)Instance methods | Method | Description | | --- | --- | --- | --- | | set.aggregate() | Aggregate all elements of a Set. | | set.any() | Test if any element of a Set matches a provided predicate. | | set.changesOn() | Create an event source that tracks changes to specified document fields in a supported Set. | | set.concat() | Concatenate two Sets. | | set.count() | Get the number of elements in a Set. | | set.distinct() | Get the unique elements of a Set. | | set.drop() | Drop the first N elements of a Set. | | set.eventsOn() | Create an event source that tracks changes to specified document fields in a supported Set. | | set.eventSource() | Create an event source that tracks changes to documents in a supported Set. | | set.every() | Test if every element of a Set matches a provided predicate. | | set.first() | Get the first element of a Set. | | set.firstWhere() | Get the first element of a Set that matches a provided predicate. | | set.flatMap() | Apply a provided function to each Set element and flatten the resulting Set by one level. | | set.fold() | Reduce the Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. | | set.foldRight() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. | | set.forEach() | Run a provided function on each element of a Set. Can perform writes. | | set.includes() | Test if the Set includes a provided element. | | set.isEmpty() | Test if a Set is empty. | | set.last() | Get the last element of a Set. | | set.lastWhere() | Get the last element of a Set that matches a provided predicate. | | set.map() | Apply a provided function to each element of a Set. Can’t perform writes. | | set.nonEmpty() | Test if a Set is not empty. | | set.order() | Sort a Set's elements. | | set.pageSize() | Set the maximum elements per page in paginated results. | | set.paginate() | Convert a Set to an Object with pagination. | | set.reduce() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from left to right. Uses the first element as the initial value. 
| | set.reduceRight() | Reduce a Set to a single, accumulated value by applying a provided function to each element. Iterates through elements from right to left. Uses the first element as the initial value. | | set.reverse() | Reverse the order of a Set's elements. | | set.take() | Get the first N elements of a Set. | | set.toArray() | Convert a Set to an Array. | | set.toStream() | Create an event source that tracks changes to documents in a supported Set. | | set.toString() | Return the string "[set]". | | set.where() | Get the elements of a Set that match a provided predicate. | # `Set.paginate()` | Learn: Sets | | --- | --- | --- | Get a page of [paginated results](../../../../learn/query/pagination/) using an `after` cursor. ## [](#signature)Signature ```fql-sig Set.paginate(cursor: String | SetCursor) => { data: Array, after: String | Null } Set.paginate(cursor: String | SetCursor, size: Number) => { data: Array, after: String | Null } ``` ## [](#description)Description Gets a page of [paginated results](../../../../learn/query/pagination/) using an [`after` cursor](../../../../learn/query/pagination/#cursor). The default page size of 16 can be changed using the [`set.paginate()`](../paginate/) or [`set.pageSize()`](../pagesize/) method, in the range 1 to 16000 (inclusive). The cursor is stable in the sense that pagination through a Set is done for a fixed snapshot time, giving you a view of your data as it existed across the whole Set at the instant you started paginating. For example, given Set \[_a_, _b_, _c_\] when you start paginating one item at a time, even if you delete item _c_ after you start reading the Set, item _c_ is still returned. The exception is if the history is no longer available for the deleted item because `history_days` uses the default value of `0` or is less than the minimum valid time needed. In that case, the deleted item is not returned with the paginated results and a `Requested timestamp …` error is returned. # `Set.single()` | Learn: Sets | | --- | --- | --- | Create a [Set](../../../fql/types/#set) containing a single provided element. ## [](#description)Description Creates a [Set](../../../fql/types/#set) containing a single element. The element is passed to the method as an argument. You can use an event source to track changes to a [Set](../../../fql/types/#set) containing a single document. These event sources only emit events when the document changes. You typically use `Set.single()` to create a [Set](../../../fql/types/#set) from a [Document](../../../fql/types/#document) for a [document event source](../../../../learn/cdc/#doc). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Value for the returned Set's element. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set containing the provided element. | ## [](#examples)Examples The following uses `single()` to create a [Set](../../../fql/types/#set) from a single document. ```fql // Uses the `Product` collection's `byName()` index and // the `first()` method to get a single document. let product = Product.byName("cups").first() Set.single(product) ``` ``` { data: [ { id: "111", coll: Product, ts: Time("2099-03-21T19:40:45.430Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") } ] } ``` You can call [`set.eventsOn()`](../eventson/) or [`set.eventSource()`](../eventsource/) on the [Set](../../../fql/types/#set) to create a [document event source](../../../../learn/cdc/#doc).
```fql let product = Product.byName("cups").first() Set.single(product).eventSource() ``` The query returns an event source: ``` "g9WD1YPG..." ``` # `set.aggregate()` | Learn: Sets | | --- | --- | --- | Aggregate all elements of a Set. ## [](#signature)Signature ```fql-sig aggregate(seed: A, combiner: (A, A) => A) => A ``` ## [](#description)Description Aggregates all elements of the calling Set. There is no ordering expectation. The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `aggregate()` to sum stock counts let stockCounts = Product.all().map(doc => doc.stock ) stockCounts.aggregate(0, (a, b) => a + b) ``` Emits the following hint: ``` performance_hint: full_set_read - Using aggregate() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:22 | 3 | stockCounts.aggregate(0, (a, b) => a + b) | ^^^^^^^^^^^^^^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` let stockCounts = Product.all().take(20).map(doc => doc.stock ) stockCounts.aggregate(0, (a, b) => a + b) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | seed | Number | true | Initial state value. | | combiner | Function | true | Anonymous FQL function that aggregates the elements. | ### [](#combiner-function-parameters)Combiner function parameters: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | | Value returned by the previous invocation. | | current | Generic | | Current Set value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Aggregate of the iterable. If the iterable is empty, the seed is returned. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3, 4].toSet() set.aggregate(0, (a, b) => a + b) ``` ``` 10 ``` # `set.any()` | Learn: Sets | | --- | --- | --- | Test if any element of a Set matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig any(predicate: (A => Boolean | Null)) => Boolean ``` ## [](#description)Description Tests if any element of the calling Set matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts a Set element as its only argument. Supports shorthand-syntax for objects and documents.Returns Boolean or Null.The method returns true if the predicate is true for any element in the Set. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the predicate is true for one or more elements in the Set. Otherwise, false. 
| ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3].toSet() set.any(v => v == 2) ``` ``` true ``` # `set.concat()` | Learn: Sets | | --- | --- | --- | Concatenate two [Set](../../../fql/types/#set)s. | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig concat(other: Set) => Set ``` ## [](#description)Description Creates a [Set](../../../fql/types/#set) by copying the calling Set to a new Set and appending another Set. The calling Set and the other Set aren’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | other | Set | true | Set to append to the calling Set. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | New Set composed of the concatenated Sets. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let setA = [1, 2, 3].toSet() let setB = [4, 5, 6].toSet() setA.concat(setB) ``` ``` { data: [ 1, 2, 3, 4, 5, 6 ] } ``` # `set.count()` | Learn: Sets | | --- | --- | --- | Get the number of elements in a [Set](../../../fql/types/#set). ## [](#signature)Signature ```fql-sig count() => Number ``` ## [](#description)Description Gets the number of elements in the calling [Set](../../../fql/types/#set). ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `count()` to get a count of all products Product.all().count() ``` Emits the following hint: ``` performance_hint: full_set_read - Using count() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:2:20 | 2 | Product.all().count() | ^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` Product.all().take(99).count() ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#avoid-use-on-large-sets)Avoid use on large sets This method scans the full [Set](../../../fql/types/#set), which can cause many reads and might time out for large [Set](../../../fql/types/#set)s. ### [](#considerations)Considerations A document update stores a new version of the document, which makes frequently updated counter documents poorly suited for database storage. If a frequently updated counter is essential, an event-sourcing technique is recommended to reduce database contention and unnecessary database operations. If the event-sourcing pattern isn’t suitable, consider the following improvements: * Set the collection’s [`history_days` setting](../../../../learn/doc-history/) to a small value; zero is recommended. Document history continues to be collected, but is removed sooner. * Periodically run a query to explicitly remove document history. * Instead of attempting to implement a real-time counter, consider storing countable documents as a cache and periodically analyzing the cache contents to update a reporting document.
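If you only need the size of a narrower Set, counting an indexed subset keeps the read bounded to the matching documents. A minimal sketch, assuming the `Product` collection’s `byCategory()` index used elsewhere in these examples:

```fql
// A minimal sketch: count only the documents matched by an index
// instead of scanning the whole collection.
// Assumes the `Product` collection's `byCategory()` index.
let category = Category.all().first()!
Product.byCategory(category).count()
```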
## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Count of values in the Set. | ## [](#examples)Examples Get a count of all `Product` documents: ```fql Product.all().count() ``` ``` 9 ``` # `set.distinct()` | Learn: Sets | | --- | --- | --- | Get the unique elements of a [Set](../../../fql/types/#set). ## [](#signature)Signature ```fql-sig distinct() => Set ``` ## [](#description)Description Gets the unique elements of the calling [Set](../../../fql/types/#set). The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `distinct()` to get all unique products Product.all().distinct() ``` Emits the following hint: ``` performance_hint: full_set_read - Using distinct() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:2:23 | 2 | Product.all().distinct() | ^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` Product.all().take(20).distinct() ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#avoid-use-on-large-sets)Avoid use on large Sets Avoid using `distinct()` on large or unbounded Sets that contain 16,000 or more documents. If a Set contains 16,000 or more documents, the query requires pagination. [`array.distinct()`](../../array/distinct/) would only be able to extract unique elements from each page of results. Instead, retrieve all field values and process them on the client side. See [Get unique field values](../../../../learn/query/patterns/get-unique-values/). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Unique elements in the Set. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // `toSet()` converts an Array to a Set. let set = [1, 1, 2, 3, 3].toSet() set.distinct() ``` ``` { data: [ 1, 2, 3 ] } ``` ### [](#get-unique-products-ordered-by-a-customer)Get unique products ordered by a customer In this example, you’ll use `distinct()` to get a Set of unique products ordered by a specific customer. The example uses the `Order` collection. `Order` collection documents have the following structure: ``` { id: "12345", coll: Order, ts: Time("2099-07-31T12:42:19Z"), // `items` contains a Set of `OrderItem` documents. items: { data: [ { id: "112233", coll: OrderItem, ts: Time("2099-07-31T12:42:19Z"), order: Order("12345"), product: Product("111"), quantity: 2 }, ... ] }, total: 5392, status: "cart", // `customer` contains a `Customer` document. customer: { id: "111", coll: Customer, ts: Time("2099-07-31T12:42:19Z"), cart: Order("412483941752112205"), // `orders` contains a Set of `Order` documents.
orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, createdAt: Time("2099-07-31T12:42:18.774426Z"), payment: {} } ``` The query: ```fql // Uses the `Customer` collection's `byEmail()` index to // get `Customer` collection documents by `email` field value. // In the `Customer` collection, `email` field values are unique // so return the `first()` (and only) document. let customer = Customer.byEmail("alice.appleseed@example.com").first() // Uses the `Order` collection's `byCustomer()` index to // get `Order` collection documents for the previous customer. Order.byCustomer(customer) // `Order` documents include a `items` field that contains a Set // of `OrderItem` documents. Each `OrderItem` document includes a // `product` field. This `flatMap()` call extracts the `id` and // `name` of each product, and flattens the resulting Set. .flatMap(.items.map(.product) { id, name } ) // Deduplicates the Set of product `id` and `name` values so that // it only returns unique elements. .distinct() ``` ``` { data: [ { id: "111", name: "cups" }, { id: "222", name: "donkey pinata" }, { id: "333", name: "pizza" } ] } ``` # `set.drop()` | Learn: Sets | | --- | --- | --- | Drop the first _N_ elements of a [Set](../../../fql/types/#set). | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig drop(amount: Number) => Set ``` ## [](#description)Description Drops a provided number of elements from a Set, beginning at element\[0\]. This lets you "skip" the elements. The calling Set isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | amount | Number | true | Number of elements to drop. If this value is greater than the Set length, an empty Set is returned. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | A Set with the elements dropped. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3, 4, 5].toSet() set.drop(2) ``` ``` { data: [ 3, 4, 5 ] } ``` # `set.eventsOn()` | Learn: Sets | | --- | --- | --- | Create an [event source](../../../../learn/cdc/) that tracks changes to specified document fields in a [supported Set](../../../../learn/cdc/#sets). ## [](#signature)Signature ```fql-sig eventsOn(fields: ...A => Any) => EventSource ``` ## [](#description)Description Creates an [event source token](../../../../learn/cdc/) that tracks changes to specified document fields in a [Set](../../../fql/types/#set). The token has the [EventSource](../../../fql/types/#event-source) type: ``` "g9WD1YPG..." ``` You can only call `eventsOn()` on a [supported Set](../../../../learn/cdc/#sets). The exact behavior of the method depends on this source. The calling Set isn’t changed. [Sets](../../../fql/types/#set) for event sources support a limited number of [transformations and filters](../../../../learn/cdc/#transformations-filters). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | fields | Any | true | Comma-separated list of document field accessors (using dot notation). The event source tracks changes to the values of these document field. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | EventSource | A string-encoded token that represents the event source. 
Use the token to consume events as an event feed or event stream. | ## [](#examples)Examples ### [](#collection-event-sources)Collection event sources You can use `eventsOn()` to only track changes to specific fields for documents in a collection. ```fql Product.all().eventsOn(.description) ``` The query returns an event source. ```json "g9WD1YPG..." ``` ### [](#index-event-sources)Index event sources Event sources for indexes only send events for changes to the index’s `terms` or `values` fields. For example, the following `Product` collection’s `byName()` index has: * A `terms` field of `name` * A `values` field of `stock` ```fsl collection Product { *: Any index byName { terms [.name] values [desc(.stock)] } ... } ``` When called on an index, `eventsOn()` only accepts the index’s `terms` or `values` fields as arguments. For example, in the following query, `eventsOn()` only accepts `.name` and `.stock` as arguments. ```fql Product.byName("limes").eventsOn(.name, .stock) ``` ### [](#document-event-sources)Document event sources You can use `eventsOn()` to track changes to a [Set](../../../fql/types/#set) containing a single document. The following query only tracks changes to the `name` or `price` field of a single document. ```fql let product = Product.byId("111")! Set.single(product).eventsOn(.name, .price) ``` ## [](#see-also)See also * [`set.eventSource()`](../eventsource/) * [Event feeds and event streams](../../../../learn/cdc/) # `set.eventSource()` | Learn: Sets | | --- | --- | --- | Create an [event source](../../../../learn/cdc/) that tracks changes to documents in a [supported Set](../../../../learn/cdc/#sets). ## [](#signature)Signature ```fql-sig eventSource() => EventSource ``` ## [](#description)Description Creates an [event source token](../../../../learn/cdc/) that tracks changes to documents in a [supported Set](../../../../learn/cdc/#sets). The token has the [EventSource](../../../fql/types/#event-source) type: ``` "g9WD1YPG..." ``` You can only call `eventSource()` on a [supported Set](../../../../learn/cdc/#sets). The exact behavior of the method depends on this source. The calling [Set](../../../fql/types/#set) isn’t changed. Source [Set](../../../fql/types/#set)s for event sources support a limited number of [transformations and filters](../../../../learn/cdc/#transformations-filters). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | EventSource | A string-encoded token that represents the event source. Use the token to consume events as an event feed or event stream. | ## [](#examples)Examples ### [](#collection-event-sources)Collection event sources Calling `eventSource()` directly on [`collection.all()`](../../collection/instance-all/) tracks any change to any document in the collection. ```fql Product.all().eventSource() ``` The query returns an event source: ``` "g9WD1YPG..." ``` ### [](#index-event-sources)Index event sources Event sources for indexes only send events for changes to the index’s `terms` or `values` fields. For example, the following `Product` collection’s `byName()` index has: * A `terms` field of `name` * `values` fields of `name` and `price` ```fsl collection Product { *: Any index byName { terms [.name] values [.name, .price] } ... } ``` The following query only tracks changes to the `name` or `price` fields for `Product` documents with a `name` of `limes`.
```fql Product.byName("limes").eventSource() ``` ### [](#document-event-sources)Document event sources You can use event sources to track changes to a [Set](../../../fql/types/#set) containing a single document. These event sources are only sent events when the document changes. ```fql // Uses the `Product` collection's `byName()` index and // the `first()` method to get a single document. let product = Product.byName("cups").first() Set.single(product).eventSource() ``` ## [](#see-also)See also * [`set.eventsOn()`](../eventson/) * [Event feeds and event streams](../../../../learn/cdc/) # `set.every()` | Learn: Sets | | --- | --- | --- | Test if every element of a Set matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig every(predicate: (A => Boolean | Null)) => Boolean ``` ## [](#description)Description Tests if every element of the calling Set matches a provided [predicate function](../../../fql/functions/#predicates). ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `every()` to test if all products have // more than 20 items in stock. Product.all().every(.stock > 20) ``` Emits the following hint: ``` performance_hint: full_set_read - Using every() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:20 | 3 | Product.all().every(.stock > 20) | ^^^^^^^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` Product.all().take(20).every(.stock > 20) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts a Set element as its only argument. Supports shorthand-syntax for objects and documents.Returns Boolean or Null.The method returns true if the predicate is true for every element in the Set. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the predicate evaluates to true for every element of the Set. Otherwise, false. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, -2, 3].toSet() set.every(v => v > 0) ``` ``` false ``` # `set.first()` | Learn: Sets | | --- | --- | --- | Get the first element of a [Set](../../../fql/types/#set). ## [](#signature)Signature ```fql-sig first() => A | Null ``` ## [](#description)Description Gets the first element of the calling [Set](../../../fql/types/#set). ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | First element of the calling Set. | | Null | Returned if the Set is empty.
| ## [](#examples)Examples Get the first Customers document: ```fql Customer.all().first() ``` ``` { id: "111", coll: Customer, ts: Time("2099-07-31T12:42:19Z"), cart: Order("412483941752112205"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` # `set.firstWhere()` | Learn: Sets | | --- | --- | --- | Get the first element of a [Set](../../../fql/types/#set) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig firstWhere(predicate: (A => Boolean | Null)) => A | Null ``` ## [](#description)Description Gets the first element of the calling Set that matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts a Set element as its only argument. Supports shorthand-syntax for objects and documents.Returns Boolean or Null.The method returns the first Set element for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | First element of the Set that matches the predicate. | | Null | Returned if no Set element matches the predicate or the Set is empty. | ## [](#examples)Examples Get the first Customers document where the `state` property is `DC`: ```fql Customer.all().firstWhere(.address.state == 'DC') ``` ``` { id: "111", coll: Customer, ts: Time("2099-07-31T12:42:19Z"), cart: Order("412571379960906240"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } } ``` # `set.flatMap()` | Learn: Sets | | --- | --- | --- | Apply a provided [function](../../../fql/functions/) to each [Set](../../../fql/types/#set) element and flatten the resulting Set by one level. | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig flatMap(mapper: (A => Set)) => Set ``` ## [](#description)Description Creates a [Set](../../../fql/types/#set) by invoking a provided mapper [function](../../../fql/functions/) on each element of the calling Set and flattening the resulting Set one level. The [Set](../../../fql/types/#set) elements are passed as a parameter to the mapper function, sequentially. The calling Set isn’t changed. ### [](#iterator-methods)Iterator methods FQL provides several methods for iterating over a Set. [`set.forEach()`](../foreach/), [`set.map()`](../map/), [`set.flatMap()`](./) are similar but used for different purposes: | Method | Primary use | Notes | | --- | --- | --- | --- | --- | | set.forEach() | Perform in-place writes on Set elements. | Doesn’t return a value. | | set.map() | Returns a new Set. | Can’t perform writes. | | set.flatMap() | Similar to set.map(), but flattens the resulting Set by one level. | Can’t perform writes. | For examples, see: * [`set.forEach()` vs. `set.map()`](../map/#foreach-vs-map) * [`set.map()` vs. `set.flatMap()`](#map-vs-flatmap) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | mapper | Function | true | Function to invoke on each element of the calling Set. Each element is passed to the mapper function as an argument. 
The function must return a Set. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set containing the result of invoking the mapper function on each element of the calling Set. The resulting Set is flattened by one level. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Returns a Set let topCustomer = Customer.byEmail("alice.appleseed@example.com") // Returns a flattened Set topCustomer.flatMap((customer) => Order.byCustomer(customer)) ``` ``` { data: [ { id: "410674590662000717", coll: Order, ts: Time("2099-10-02T22:37:39.583Z"), items: "hdW...", total: 5392, status: "cart", customer: Customer("111"), createdAt: Time("2099-10-02T22:37:39.434810Z"), payment: {} } ] } ``` ### [](#map-vs-flatmap)`set.map()` vs. `set.flatMap()` [`set.flatMap()`](./) is similar to [`set.map()`](../map/), except [`set.flatMap()`](./) also flattens the resulting Set by one level. In the following example, [`set.map()`](../map/) returns a two-dimensional Set: ```fql // Get a Set of all `Category` collection documents. let categories = Category.all() // Use `map()` to get a Set of `Product` documents // for each category. categories.map(category => { Product.byCategory(category) }) ``` ``` // Two-dimensional Set. { data: [ { data: [ { id: "111", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, ... ] }, { data: [ { id: "333", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") } ] }, { data: [ { id: "444", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") }, ... ] } ] } ``` To flatten the result to a one-dimensional array, use [`set.flatMap()`](./) instead: ```fql // Get a Set of all `Category` collection documents. let categories = Category.all() // Use `flatMap()` to get a Set of `Product` documents // for each category. Then flatten the resulting Set. categories.flatMap(category => { Product.byCategory(category) }) ``` ``` // One-dimensional Set. { data: [ { id: "111", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, ... { id: "333", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") }, ... { id: "444", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ] } ``` # `set.fold()` | Learn: Sets | | --- | --- | --- | Reduce the [Set](../../../fql/types/#set) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from left to right. Uses a provided seed as the initial value. ## [](#signature)Signature ```fql-sig fold(seed: B, reducer: (B, A) => B) => B ``` ## [](#description)Description Iterates through each element in a Set to perform a rolling operation. For example, you can use `fold()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `fold()` calls a reducer callback function on every element of the Set from left to right. 
The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. For the first iteration, a seed value serves as the initial accumulator. * The current element’s value from the Set. The method returns the result of the last iteration. The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `fold()` to sum stock counts let stockCounts = Product.all().map(doc => doc.stock) stockCounts.fold(0, (a, b) => a + b) ``` Emits the following hint: ``` performance_hint: full_set_read - Using fold() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:17 | 3 | stockCounts.fold(0, (a, b) => a + b) | ^^^^^^^^^^^^^^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` let stockCounts = Product.all().take(20).map(doc => doc.stock) stockCounts.fold(0, (a, b) => a + b) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce a Set to a single value. These methods include: * [`set.fold()`](./) * [`set.foldRight()`](../foldright/) * [`set.reduce()`](../reduce/) * [`set.reduceRight()`](../reduceright/) The methods are similar but have the following differences: * [`set.fold()`](./) and [`set.foldRight()`](../foldright/) accept an initial _seed_ value and use it as the initial _accumulator_. [`set.reduce()`](../reduce/) and [`set.reduceRight()`](../reduceright/) use the Set’s first element as the initial _accumulator_. * [`set.fold()`](./) and [`set.reduce()`](../reduce/) iterate through the Set’s elements from left to right. [`set.foldRight()`](../foldright/) and [`set.reduceRight()`](../reduceright/) iterate through the Set’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | seed | Generic | true | Initial accumulator value provided to the reducer function. | | reducer | Function | true | Anonymous FQL function to call on each element of the Set. | ### [](#reducer-function-arguments)Reducer function arguments: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | true | Value returned by the previous reducer function call. On the first call, seed is passed as the accumulator. | | current | Generic | true | The current element’s value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. For an empty Set, the seed is returned. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // `toSet()` converts an Array to a Set. 
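// The seed (100) serves as the initial accumulator, so the
// result is 100 + 1 + 2 + 3 = 106.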
let set = [1, 2, 3].toSet() set.fold(100, (value, elem) => value + elem) ``` ``` 106 ``` ### [](#group-by-operation)Group by operation [FQL](../../../../learn/query/) doesn’t provide a built-in `GROUP BY` operation. However, you can use `fold()` in an [anonymous FQL function](../../../fql/functions/) or a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) to achieve the same result. As an FQL function: ```fql // Defines an anonymous `groupBy()` function. // `groupBy()` accepts two arguments: // * `set`: Set or Array containing data to group // * `key_fn`: Grouping key for each element let groupBy = (set, key_fn) => { // Calls the `fold()` function on the `set` // Set or Array. set.fold( // Start with an empty object. {}, (acc, val) => { // For each value, get a key using the `key_fn` arg. let key = key_fn(val) let existing_group = acc[key] ?? [] // Append the current value to the Set or // Array for that key. let new_group = existing_group.append(val) let new_entry = Object.fromEntries([ [key, new_group] ]) // Return an object with grouped results. Object.assign(acc, new_entry) } ) } // Call the `groupBy()` function. // Groups `Product` documents by category name. groupBy(Product.all(), .category!.name) ``` You can also define a `groupBy()` UDF. This lets you reuse the function across multiple queries. You create and manage a UDF as an FSL [function schema](../../../fsl/function/): ```fsl // Defines the `groupBy()` UDF. // `groupBy()` accepts two arguments: // * `set`: Set or Array containing data to group // * `key_fn`: Grouping key for each element function groupBy (set, key_fn) { // Calls the `fold()` function on the `set` // Set or Array. set.fold( // Start with an empty object. {}, (acc, val) => { // For each value, get a key using the `key_fn` arg. let key: String = key_fn(val) let existing_group = acc[key] ?? [] // Append the current value to the Set or // Array for that key. let new_group = existing_group.append(val) let new_entry = Object.fromEntries([ [key, new_group] ]) // Return an object with grouped results. Object.assign(acc, new_entry) } ) } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../../learn/schema/manage-schema/#fql) For additional examples using the `groupBy()` UDF, see [Group By: Aggregate data in Fauna](../../../../learn/query/patterns/group-by/). # `set.foldRight()` | Learn: Sets | | --- | --- | --- | Reduce a [Set](../../../fql/types/#set) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from right to left. Uses a provided seed as the initial value. ## [](#signature)Signature ```fql-sig foldRight(seed: B, reducer: (B, A) => B) => B ``` ## [](#description)Description Iterates through each element in a Set to perform a rolling operation. For example, you can use `foldRight()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `foldRight()` calls a reducer callback function on every element of the Set from right to left. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. For the first iteration, a seed value serves as the initial accumulator. * The current element’s value from the Set.
The method returns the result of the last iteration. The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `foldRight()` to sum stock counts let stockCounts = Product.all().map(doc => doc.stock) stockCounts.foldRight(0, (a, b) => a + b) ``` Emits the following hint: ``` performance_hint: full_set_read - Using foldRight() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:22 | 3 | stockCounts.foldRight(0, (a, b) => a + b) | ^^^^^^^^^^^^^^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` let stockCounts = Product.all().take(20).map(doc => doc.stock) stockCounts.foldRight(0, (a, b) => a + b) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce a Set to a single value. These methods include: * [`set.fold()`](../fold/) * [`set.foldRight()`](./) * [`set.reduce()`](../reduce/) * [`set.reduceRight()`](../reduceright/) The methods are similar but have the following differences: * [`set.fold()`](../fold/) and [`set.foldRight()`](./) accept an initial _seed_ value and use it as the initial _accumulator_. [`set.reduce()`](../reduce/) and [`set.reduceRight()`](../reduceright/) use the Set’s first element as the initial _accumulator_. * [`set.fold()`](../fold/) and [`set.reduce()`](../reduce/) iterate through the Set’s elements from left to right. [`set.foldRight()`](./) and [`set.reduceRight()`](../reduceright/) iterate through the Set’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | seed | Generic | true | Initial accumulator value provided to the reducer function. | | reducer | Function | true | Anonymous FQL function to call on each element of the Set. | ### [](#reducer-function-arguments)Reducer function arguments: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | true | Value returned by the previous reducer function call. On the first call, seed is passed as the accumulator. | | current | Generic | true | The current element’s value. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. For an empty Set, the seed is returned. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3].toSet() set.foldRight(100, (value, elem) => value + elem) ``` ``` 106 ``` # `set.forEach()` | Learn: Sets | | --- | --- | --- | Run a provided [function](../../../fql/functions/) on each element of a [Set](../../../fql/types/#set). Can perform writes. 
## [](#signature)Signature ```fql-sig forEach(callback: (A => Any)) => Null ``` ## [](#description)Description Iterates over all elements in the [Set](../../../fql/types/#set) and executes a provided callback function on each element. It is used for mutations or writing documents based on each [Set](../../../fql/types/#set) element. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `forEach()` to increment each product's // stock count by 1. Product.all().forEach( doc => doc.stock + 1 ) ``` Emits the following hint: ``` performance_hint: full_set_read - Using forEach() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:22 | 3 | Product.all().forEach( | ______________________^ 4 | | doc => doc.stock + 1 5 | | ) | |_^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` Product.all().take(20).forEach( doc => doc.stock + 1 ) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#avoid-use-on-large-sets)Avoid use on large sets This method scans the full [Set](../../../fql/types/#set), which can cause many reads and might time out for large [Set](../../../fql/types/#set)s. ### [](#iterator-methods)Iterator methods FQL provides several methods for iterating over a Set. [`set.forEach()`](./), [`set.map()`](../map/), [`set.flatMap()`](../flatmap/) are similar but used for different purposes: | Method | Primary use | Notes | | --- | --- | --- | --- | --- | | set.forEach() | Perform in-place writes on Set elements. | Doesn’t return a value. | | set.map() | Returns a new Set. | Can’t perform writes. | | set.flatMap() | Similar to set.map(), but flattens the resulting Set by one level. | Can’t perform writes. | For examples, see: * [`set.forEach()` vs. `set.map()`](#foreach-vs-map) * [`set.map()` vs. `set.flatMap()`](../map/#map-vs-flatmap) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | callback | Function | true | Anonymous FQL function to call on each element of the Set. Each call returns Null. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Null | This method always returns Null. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. let customers = Customer.where(.address?.state == "DC") // Use `forEach()` to update each document in the previous Set. customers.forEach(doc => doc.update({ address: { street: doc?.address?.street, city: doc?.address?.city, state: "District of Columbia", postalCode: doc?.address?.postalCode, country: doc?.address?.country, } })) // `forEach()` returns `null`. ``` ``` null ``` Although it returns `null`, [`set.forEach()`](./) still performed the requested operations. 
To verify: ```fql // Get all `Customer` collection documents Customer.all() ``` ``` // The results contain `Customer` documents updated by // the previous `forEach()` call. { data: [ { id: "111", coll: Customer, ts: Time("2099-10-02T21:50:14.555Z"), cart: Order("410671593745809485"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "District of Columbia", // `state` has been updated. postalCode: "20220", country: "US" } }, ... ] } ``` ### [](#foreach-vs-map)`set.forEach()` vs. `set.map()` You can use both [`set.forEach()`](./) and [`set.map()`](../map/) to iterate through a Set. Use [`set.forEach()`](./) to perform in-place writes on the calling Set: ```fql // Gets the frozen category. let frozen = Category.byName("frozen").first() // Uses `forEach()` to delete each product in // the frozen category. Product.byCategory(frozen).forEach(product => { product.delete() }) ``` ``` null ``` Although it returns `null`, [`set.forEach()`](./) still performs the requested operations. Unlike [`set.forEach()`](./), [`set.map()`](../map/) can’t perform writes: ```fql // Gets the produce category. let produce = Category.byName("produce").first() // Attempts to use `map()` to delete each product in // the produce category. Product.byCategory(produce).map(product => { product.delete() }) ``` ``` invalid_effect: `delete` performs a write, which is not allowed in set functions. error: `delete` performs a write, which is not allowed in set functions. at *query*:7:17 | 7 | product.delete() | ^^ | ``` Instead, you can use [`set.map()`](../map/) to output a new Set containing extracted or transformed values: ```fql // Gets the produce category. let produce = Category.byName("produce").first() // Uses `map()` to outputs a new Set containing products in // the produce category. The new Set transforms each product's // name. Product.byCategory(produce).map(product => { let product: Any = product { name: product.category.name + ": " + product.name, } }) ``` ``` { data: [ { name: "produce: avocados" }, { name: "produce: single lime" }, { name: "produce: organic limes" }, { name: "produce: limes" }, { name: "produce: cilantro" } ] } ``` # `set.includes()` | Learn: Sets | | --- | --- | --- | Test if the [Set](../../../fql/types/#set) includes a provided element. ## [](#signature)Signature ```fql-sig includes(element: A) => Boolean ``` ## [](#description)Description Tests if the [Set](../../../fql/types/#set) includes a provided element. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `includes()` to check whether the `Product` // collection includes a specific document. let limes: Any = Product.byId("777") Product.all().includes(limes) ``` Emits the following hint: ``` performance_hint: full_set_read - Using includes() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. 
at *query*:4:23 | 4 | Product.all().includes(limes) | ^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` let limes: Any = Product.byId("777") Product.all().take(20).includes(limes) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#avoid-filtering-using-includes)Avoid filtering using `includes()` In most cases, you should avoid using [`set.includes()`](./) to intersect results, including results from [covered index calls](../../../../learn/data-model/indexes/#covered-queries). [`set.includes()`](./) is a linear operation. The compute costs consumed by repeatedly iterating through results will typically exceed the read costs of directly reading documents. For example, the following query is inefficient and will likely incur high compute costs: ```fql // Each variable is a covered index call let limes = Product.byName("limes") let produce = Product.byCategory(Category.byName("produce").first()!) let under5 = Product.sortedByPriceLowToHigh({ to: 500 }) // Uses `includes()` to intersect the results from // covered index calls limes.where(doc => produce.includes(doc)) .where(doc => under5.includes(doc)) ``` Instead, use a [covered index call](../../../../learn/data-model/indexes/#covered-queries) and [`set.where()`](../where/) to filter the results as outlined in [Filter using `where()`](../../../../learn/query/patterns/sets/#where). For example, you can rewrite the previous query as: ```fql // Start with a covered index call. Product.byName("limes") // Layer on filters using `where()` .where(doc => doc.category == Category.byName("produce").first()!) .where(doc => doc.price < 500 ) ``` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | element | Generic | true | Element to check the Set for. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Set contains the provided element. Otherwise, false. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3].toSet() set.includes(2) ``` ``` true ``` # `set.isEmpty()` | Learn: Sets | | --- | --- | --- | Test if a [Set](../../../fql/types/#set) is empty. ## [](#signature)Signature ```fql-sig isEmpty() => Boolean ``` ## [](#description)Description Tests if the calling [Set](../../../fql/types/#set) is empty and contains no elements. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Set is empty. Otherwise, false. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. // In thise case, it creates an empty Set. let set = [].toSet() set.isEmpty() ``` ``` true ``` # `set.last()` | Learn: Sets | | --- | --- | --- | Get the last element of a [Set](../../../fql/types/#set). ## [](#signature)Signature ```fql-sig last() => A | Null ``` ## [](#description)Description Returns the last element of the [Set](../../../fql/types/#set). The method reverses the Set and gets the first item. ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Last element of the Set. | | Null | Returned if the Set is empty. 
| ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2].toSet() set.last() ``` ``` 2 ``` ## [](#see-also)See also [`set.first()`](../first/) [`set.reverse()`](../reverse/) # `set.lastWhere()` | Learn: Sets | | --- | --- | --- | Get the last element of a [Set](../../../fql/types/#set) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig lastWhere(predicate: (A => Boolean | Null)) => A | Null ``` ## [](#description)Description Gets the last element of the calling Set that matches a provided [predicate function](../../../fql/functions/#predicates). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | true | Anonymous predicate function that:Accepts a Set element as its only argument. Supports shorthand-syntax for objects and documents.Returns a Boolean value.The method returns the last Set element for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Last element of the Set that matches the predicate. | | Null | Returned if no Set element matches the predicate or the Set is empty. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3, 4].toSet() set.lastWhere(v => v > 2) ``` ``` 4 ``` # `set.map()` | Learn: Sets | | --- | --- | --- | Apply a provided [function](../../../fql/functions/) to each element of a [Set](../../../fql/types/#set). Can’t perform writes. | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig map(mapper: (A => B)) => Set ``` ## [](#description)Description Creates a [Set](../../../fql/types/#set) by applying a mapper function to each element in the calling [Set](../../../fql/types/#set). Writes are not permitted. The calling Set isn’t changed. If `map()` is the last value in a query, the first page of the new [Set](../../../fql/types/#set) is returned. ### [](#iterator-methods)Iterator methods FQL provides several methods for iterating over a Set. [`set.forEach()`](../foreach/), [`set.map()`](./), [`set.flatMap()`](../flatmap/) are similar but used for different purposes: | Method | Primary use | Notes | | --- | --- | --- | --- | --- | | set.forEach() | Perform in-place writes on Set elements. | Doesn’t return a value. | | set.map() | Returns a new Set. | Can’t perform writes. | | set.flatMap() | Similar to set.map(), but flattens the resulting Set by one level. | Can’t perform writes. | For examples, see: * [`set.forEach()` vs. `set.map()`](#foreach-vs-map) * [`set.map()` vs. `set.flatMap()`](#map-vs-flatmap) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | mapper | Function | true | Anonymous FQL function to call on each element of the calling Set. Each call returns a value that’s returned in the result Set. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set containing values returned by each mapper function call. 
| ## [](#examples)Examples ### [](#basic-example)Basic example For all Customer documents, combine the `address.city` and `address.state` properties into a single string: ```fql Customer.all().map( customer => { name: customer.name, city: "#{customer.address.city}, #{customer.address.state}" } ) ``` ``` { data: [ { name: "Alice Appleseed", city: "Washington, DC" }, { name: "Bob Brown", city: "Washington, DC" }, { name: "Carol Clark", city: "Washington, DC" } ] } ``` ### [](#project)Project Sets using `set.map()` You can use `set.map()` to project a Set, similar to using [Set projection](../../../fql/projection/#set). For example, the following projection query: ```fql Product.sortedByPriceLowToHigh() { name, description, price } ``` Is equivalent to the following `set.map()` query: ```fql Product.sortedByPriceLowToHigh().map(prod => { name: prod.name, description: prod.description, price: prod.price, }) ``` ### [](#foreach-vs-map)`set.forEach()` vs. `set.map()` You can use both [`set.forEach()`](../foreach/) and [`set.map()`](./) to iterate through a Set. Use [`set.forEach()`](../foreach/) to perform in-place writes on the calling Set: ```fql // Gets the frozen category. let frozen = Category.byName("frozen").first() // Uses `forEach()` to delete each product in // the frozen category. Product.byCategory(frozen).forEach(product => { product.delete() }) ``` ``` null ``` Although it returns `null`, [`set.forEach()`](../foreach/) still performs the requested operations. Unlike [`set.forEach()`](../foreach/), [`set.map()`](./) can’t perform writes: ```fql // Gets the produce category. let produce = Category.byName("produce").first() // Attempts to use `map()` to delete each product in // the produce category. Product.byCategory(produce).map(product => { product.delete() }) ``` ``` invalid_effect: `delete` performs a write, which is not allowed in set functions. error: `delete` performs a write, which is not allowed in set functions. at *query*:7:17 | 7 | product.delete() | ^^ | ``` Instead, you can use [`set.map()`](./) to output a new Set containing extracted or transformed values: ```fql // Gets the produce category. let produce = Category.byName("produce").first() // Uses `map()` to outputs a new Set containing products in // the produce category. The new Set transforms each product's // name. Product.byCategory(produce).map(product => { let product: Any = product { name: product.category.name + ": " + product.name, } }) ``` ``` { data: [ { name: "produce: avocados" }, { name: "produce: single lime" }, { name: "produce: organic limes" }, { name: "produce: limes" }, { name: "produce: cilantro" } ] } ``` ### [](#map-vs-flatmap)`set.map()` vs. `set.flatMap()` [`set.flatMap()`](../flatmap/) is similar to [`set.map()`](./), except [`set.flatMap()`](../flatmap/) also flattens the resulting Set by one level. In the following example, [`set.map()`](./) returns a two-dimensional Set: ```fql // Get a Set of all `Category` collection documents. let categories = Category.all() // Use `map()` to get a Set of `Product` documents // for each category. categories.map(category => { Product.byCategory(category) }) ``` ``` // Two-dimensional Set. { data: [ { data: [ { id: "111", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, ... 
] }, { data: [ { id: "333", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") } ] }, { data: [ { id: "444", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") }, ... ] } ] } ``` To flatten the result to a one-dimensional array, use [`set.flatMap()`](../flatmap/) instead: ```fql // Get a Set of all `Category` collection documents. let categories = Category.all() // Use `flatMap()` to get a Set of `Product` documents // for each category. Then flatten the resulting Set. categories.flatMap(category => { Product.byCategory(category) }) ``` ``` // One-dimensional Set. { data: [ { id: "111", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, ... { id: "333", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") }, ... { id: "444", coll: Product, ts: Time("2099-10-02T22:37:39.583Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ] } ``` # `set.nonEmpty()` | Learn: Sets | | --- | --- | --- | Test if a [Set](../../../fql/types/#set) is not empty. ## [](#signature)Signature ```fql-sig nonEmpty() => Boolean ``` ## [](#description)Description Tests if the calling [Set](../../../fql/types/#set) is not empty. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Set is not empty. Otherwise, false. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. // In thise case, it creates an empty Set. let set = [].toSet() set.nonEmpty() ``` ``` false ``` # `set.order()` | Learn: Sets | | --- | --- | --- | Sort a [Set](../../../fql/types/#set)'s elements. ## [](#signature)Signature ```fql-sig order(ordering: ...(A => Any) & {}) => Set ``` ## [](#description)Description Creates a sorted [Set](../../../fql/types/#set) by applying one or more sorting criteria to the calling [Set](../../../fql/types/#set). You define each sorting criterion by wrapping `asc()` (ascending) or `desc()` (descending) around a [field accessor](../../../fql/dot-notation/) or a read-only [anonymous function](../../../fql/functions/). The first criterion has the highest sorting priority, with priority decreasing for each subsequent criterion. If `order()` is the last value in an expression, the first page of the new [Set](../../../fql/types/#set) is returned. See [Pagination](../../../../learn/query/pagination/). The calling Set remains unchanged. ### [](#using-order-with-indexes)Using `order()` with indexes Calling `order()` on an [index](../../../../learn/data-model/indexes/), including the built-in [`all()`](../../collection/instance-all/) index, requires a read of each document and [historical document snapshot](../../../../learn/data-model/indexes/#history) covered by the index. For example: ```fql // Calls `order` on the built-in `all()` index. // The query requires a read of each current `Product` // collection document and any historical document snapshots. Product.all().order(.price) { price } ``` Performance hint: `full_set_read` Queries that call `order()` on an index emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. 
For example: ``` performance_hint: full_set_read - Using order() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:4:20 | 4 | Product.all().order(.price) { price } | ^^^^^^^^ | ``` If you frequently run such queries, consider adding the fields used for ordering to an index definition’s [`values`](../../../../learn/data-model/indexes/#sort-documents). For example: ```fsl collection Product { ... // Adds `price` as an index value. index sortedByPriceLowToHigh { values [.price] } ... } ``` When you call the index, returned documents are sorted by the index’s values. Using index values can significantly reduce the number of read ops required to sort results. ```fql // Get `Product` documents sorted by ascending `price` // using the index values. // This is equivalent to the previous query. Product.sortedByPriceLowToHigh() { price } ``` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | ordering | Generic | | One or more sorting criteria, separated by commas.Each criterion is a field accessor or read-only anonymous function, optionally wrapped in asc() (ascending) or desc() (descending) to indicate sort order.If neither asc() nor desc() is provided, asc() is used by default.The anonymous function is passed each Set element as an argument. For document Sets, each Set element is a Document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | New Set with elements in requested order. | ## [](#examples)Examples ### [](#sort-fields-in-ascending-order)Sort fields in ascending order ```fql // Sort `Customer` collection documents by `name` // in ascending order (default). Customer.all().order(.name) { name, email } ``` ``` { data: [ { name: "Alice Appleseed", email: "alice.appleseed@example.com" }, { name: "Bob Brown", email: "bob.brown@example.com" }, { name: "Carol Clark", email: "carol.clark@example.com" } ] } ``` ### [](#sort-fields-in-descending-order)Sort fields in descending order ```fql // Sort `Customer` collection documents by `name` // in descending order. Customer.all().order(desc(.name)) { name, email } ``` ``` { data: [ { name: "Carol Clark", email: "carol.clark@example.com" }, { name: "Bob Brown", email: "bob.brown@example.com" }, { name: "Alice Appleseed", email: "alice.appleseed@example.com" } ] } ``` ### [](#sort-fields-using-multiple-arguments)Sort fields using multiple arguments ```fql // Sort `Customer` collection documents by: // - Ascending `name` then... // - Ascending `address.street`. Customer.all().order(.name, .address.street) { name, address { street }, email } ``` ``` { data: [ { name: "Alice Appleseed", address: { street: "87856 Mendota Court" }, email: "alice.appleseed@example.com" }, { name: "Bob Brown", address: { street: "72 Waxwing Terrace" }, email: "bob.brown@example.com" }, { name: "Carol Clark", address: { street: "5 Troy Trail" }, email: "carol.clark@example.com" } ] } ``` ### [](#sort-fields-using-an-anonymous-function)Sort fields using an anonymous function In addition to using field accessors, you can use a read-only anonymous function as a sorting criterion: ```fql // Sort Customer collection documents by prioritizing // those where the email contains "example.com". Customer.all().order(asc( (doc) => if (doc.email.includes("example.com")) { // Prioritize these documents. // Lower numbers (`0`) appear first when sorted // in ascending order.
0 // After that, sort the remaining documents by the // length of their email. } else { doc.email.length } )) { name, email } ``` ``` { data: [ { name: "Alice Appleseed", email: "alice.appleseed@example.com" }, { name: "Bob Brown", email: "bob.brown@example.com" }, { name: "Carol Clark", email: "carol.clark@example.com" }, { name: "Jane Doe", email: "12-fake@fauna.com" }, { name: "John Doe", email: "123-fake@fauna.com" } ] } ``` # `set.pageSize()` | Learn: Pagination | | --- | --- | --- | Set the maximum elements per page in [paginated results](../../../../learn/query/pagination/). | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig pageSize(size: Number) => Set ``` ## [](#description)Description Sets the maximum elements per page in [paginated results](../../../../learn/query/pagination/). If a subsequent page is available, the result includes an `after` cursor. To iterate through paginated results, pass the `after` cursor to [`Set.paginate()`](../static-paginate/). ### [](#method-chaining)Method chaining `pageSize()` should typically be the last method call in an FQL expression. `pageSize()` only affects the rendering of a Set, not subsequent operations. Methods chained to `pageSize()` access the entire calling Set, not a page of results. ### [](#after-cursor)`after` cursor See [Cursor state and expiration](../../../../learn/query/pagination/#cursor-state-expiration). ### [](#differences-with-paginate)Differences with `paginate()` The following table outlines differences between [`set.pageSize()`](./) and [`set.paginate()`](../paginate/): | Difference | set.pageSize() | set.paginate() | | --- | --- | --- | --- | --- | | Use case | Use in most cases. | Use when needing to access an 'after' cursor or paginated results within an FQL query. | | Return type | Returns a set. | Returns an object. | | Loading strategy | Lazy loading. Only fetches results as needed. | Eager loading. Fetches results instantly, even if the results aren’t returned or used. | | Client driver methods | Compatible with driver pagination methods. | Incompatible with driver pagination methods. | | Projection | Supports projections. | Doesn’t support projections. | | Set instance methods | Supports set instance methods. | Doesn’t support set instance methods. | ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | size | Int | true | Number of Set values to include in the returned page. Must be in the range 1 to 16000 (inclusive). | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set that includes the following field: FieldTypeDescriptionafterString | NullCursor for the next page of results. The cursor is valid for history_days plus 15 minutes.If no additional pages exist, after is Null. | Field | Type | Description | after | String | Null | Cursor for the next page of results. The cursor is valid for history_days plus 15 minutes.If no additional pages exist, after is Null. | | Field | Type | Description | | after | String | Null | Cursor for the next page of results. The cursor is valid for history_days plus 15 minutes.If no additional pages exist, after is Null. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql // Calls `pageSize()` with a size of `2`. Product.all().pageSize(2) ``` ``` { // The returned Set contains two elements or fewer. 
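// An `after` cursor is also included because more pages exist.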
data: [ { id: "111", coll: Product, ts: Time("2099-07-31T12:58:51.680Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, { id: "222", coll: Product, ts: Time("2099-07-31T12:58:51.680Z"), name: "donkey pinata", description: "Original Classic Donkey Pinata", price: 2499, stock: 50, category: Category("123") } ], after: "hdW..." } ``` ### [](#paginate-in-reverse)Paginate in reverse Paginated queries don’t include a `before` cursor. Instead, you can use a range search and document IDs or other unique field values to paginate in reverse. For example: 1. Run an initial paginated query: ```fql Product.all().pageSize(2) ``` ``` { data: [ { id: "111", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, { id: "222", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "donkey pinata", description: "Original Classic Donkey Pinata", price: 2499, stock: 50, category: Category("123") } ], after: "hdW..." } ``` 2. Page forward until you find the document you want to start reversing from: ```fql Set.paginate("hdW...") ``` Copy the ID of the document: ``` { data: [ { id: "333", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") }, { // Begin reverse pagination from this doc ID. id: "444", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") } ], after: "hdW..." } ``` 3. To reverse paginate, run the original query with: * A range search with a `to` argument containing the previous document ID. * [`set.reverse()`](../reverse/): Append this to the query. * [`set.pageSize()`](./): If used, place it after [`set.reverse()`](../reverse/). ```fql // "444" is the ID of the document to reverse from. Product.all({ to: "444" }).reverse().pageSize(2) ``` ``` { data: [ { // The results of the previous query are reversed. id: "444", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "avocados", description: "Conventional Hass, 4ct bag", price: 399, stock: 1000, category: Category("789") }, { id: "333", coll: Product, ts: Time("2099-08-16T14:00:59.075Z"), name: "pizza", description: "Frozen Cheese", price: 499, stock: 100, category: Category("456") } ], after: "hdW..." } ``` To get historical snapshots of documents at the time of the original query, use an [`at` expression](../../../fql/statements/#at): ```fql // Time of the original query. let originalQueryTime = Time.fromString("2099-08-16T14:30:00.000Z") at (originalQueryTime) { // "444" is the ID of the document to reverse from. Product.all({ to: "444" }).reverse().pageSize(2) } ``` 4. Repeat the previous step to continue paginating in reverse: ```fql Product.all({ to: "333" }).reverse().pageSize(2) ``` ## [](#see-also)See also * [`set.paginate()`](../paginate/) * [`Set.paginate()`](../static-paginate/) # `set.paginate()` | Learn: Pagination | | --- | --- | --- | Convert a [Set](../../../fql/types/#set) to an [Object](../../../fql/types/#object) with [pagination](../../../../learn/query/pagination/). ## [](#signature)Signature ```fql-sig paginate() => { data: Array, after: String | Null } paginate(size: Number) => { data: Array, after: String | Null } ``` ## [](#description)Description Returns the calling [Set](../../../fql/types/#set) as an [Object](../../../fql/types/#object). 
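For instance, here’s a minimal sketch (it assumes the demo `Product` collection used in the other examples and an arbitrary page size of 2):

```fql
// Returns an Object with `data` and `after` fields
// rather than a Set.
Product.all().paginate(2)
```

```
{
  data: [
    ...
  ],
  after: "hdW..."
}
```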
If the Set is paginated and a subsequent page is available, the result includes an `after` cursor. To iterate through paginated results, pass the `after` cursor to [`Set.paginate()`](../static-paginate/). `paginate()` accepts an optional size argument to control page size. In most cases, you should not use `paginate()` in place of `pageSize()`. See [Differences with `pageSize()`](#differences-with-pagesize). ### [](#differences-with-pagesize)Differences with `pageSize()` The following table outlines differences between [`set.pageSize()`](../pagesize/) and [`set.paginate()`](./): | Difference | set.pageSize() | set.paginate() | | --- | --- | --- | --- | --- | | Use case | Use in most cases. | Use when needing to access an 'after' cursor or paginated results within an FQL query. | | Return type | Returns a set. | Returns an object. | | Loading strategy | Lazy loading. Only fetches results as needed. | Eager loading. Fetches results instantly, even if the results aren’t returned or used. | | Client driver methods | Compatible with driver pagination methods. | Incompatible with driver pagination methods. | | Projection | Supports projections. | Doesn’t support projections. | | Set instance methods | Supports set instance methods. | Doesn’t support set instance methods. | ### [](#after-cursor)`after` cursor See [Cursor state and expiration](../../../../learn/query/pagination/#cursor-state-expiration). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | size | Int | | Maximum number of Set elements to include in the returned page. The size parameter must be in the range 1 to 16000 (inclusive). | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Object | Object that includes the following fields: FieldTypeDescriptiondataArrayArray representing a page of elements from the calling Set. The number of elements is limited by the size parameter.afterString | NullCursor for the next page of results. The cursor is valid for history_days plus 15 minutes.If no additional pages exist, after is Null. | Field | Type | Description | data | Array | Array representing a page of elements from the calling Set. The number of elements is limited by the size parameter. | after | String | Null | Cursor for the next page of results. The cursor is valid for history_days plus 15 minutes.If no additional pages exist, after is Null. | | Field | Type | Description | | data | Array | Array representing a page of elements from the calling Set. The number of elements is limited by the size parameter. | | after | String | Null | Cursor for the next page of results. The cursor is valid for history_days plus 15 minutes.If no additional pages exist, after is Null. | ## [](#examples)Examples Queries are subject to [size limits](../../../requirements-limits/#glimits). If you’re performing bulk writes on a large dataset, you can use [`set.pageSize()`](../pagesize/) and `paginate()` to perform the write over several queries instead of one. The first query uses `paginate()` to fetch the results and generate an initial `after` cursor: ```fql // Get a Set of `Customer` collection documents with an // `address` in the `state` of `DC`. Use `pageSize()` // and`paginate()` to paginate results and // limit each page to two documents. let page = Customer.where( .address?.state == "DC" ) .pageSize(2).paginate() // `paginate()` returns an object. The object's `data` property // contains an Array of `Customer` documents. 
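// Because `data` is an Array rather than a Set, the update below
// only affects documents in this page. Later pages are reached
// using the `after` cursor.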
let data = page.data // Use `forEach()` to update each `Customer` document in the // `data` Array. data.forEach(doc => doc.update({ address: { state: "District of Columbia" } })) // Project the `after` cursor returned by `paginate()`. // You can use the cursor to iterate through the remaining // pages. page { after } ``` ``` { after: "hdWDxoq..." } ``` Subsequent queries use the cursor and [`Set.paginate()`](../static-paginate/) to iterate through the remaining pages: ```fql // Uses `Set.paginate()` to iterate through pages. let page = Set.paginate("hdWDxoq...") let data = page.data data.forEach(doc => doc.update({ address: { state: "District of Columbia" } })) page { after } ``` ## [](#see-also)See also * [`set.pageSize()`](../pagesize/) * [`Set.paginate()`](../static-paginate/) # `set.reduce()` | Learn: Sets | | --- | --- | --- | Reduce a [Set](../../../fql/types/#set) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from left to right. Uses the first element as the initial value. ## [](#signature)Signature ```fql-sig reduce(reducer: ((A, A) => A)) => A | Null ``` ## [](#description)Description Iterates through each element in a [Set](../../../fql/types/#set) to perform a rolling operation. For example, you can use `reduce()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `reduce()` calls a reducer callback function on every element of the Set from left to right. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. The first element in the Set serves as the initial accumulator. * The current element’s value. The method returns the result of the last iteration. The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `reduce()` to sum stock counts let stockCounts = Product.all().map(doc => doc.stock) stockCounts.reduce((a, b) => a + b) ``` Emits the following hint: ``` performance_hint: full_set_read - Using reduce() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:19 | 3 | stockCounts.reduce((a, b) => a + b) | ^^^^^^^^^^^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` let stockCounts = Product.all().take(20).map(doc => doc.stock) stockCounts.reduce((a, b) => a + b) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#avoid-use-on-large-sets)Avoid use on large sets This method scans the full [Set](../../../fql/types/#set), which can cause many reads and might time out for large [Set](../../../fql/types/#set)s. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce a Set to a single value. 
These methods include: * [`set.fold()`](../fold/) * [`set.foldRight()`](../foldright/) * [`set.reduce()`](./) * [`set.reduceRight()`](../reduceright/) The methods are similar but have the following differences: * [`set.fold()`](../fold/) and [`set.foldRight()`](../foldright/) accept an initial _seed_ value and use it as the initial _accumulator_. [`set.reduce()`](./) and [`set.reduceRight()`](../reduceright/) use the Set’s first element as the initial _accumulator_. * [`set.fold()`](../fold/) and [`set.reduce()`](./) iterate through the Set’s elements from left to right. [`set.foldRight()`](../foldright/) and [`set.reduceRight()`](../reduceright/) iterate through the Set’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | reducer | Function | true | Anonymous FQL function to call on each element in the Set. | ### [](#reducer-function-parameters)Reducer function parameters: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | true | Value returned by the previous reducer function call. On the first call, the Set’s first element is passed as the accumulator. | | current | Generic | true | The current element’s value. | ## [](#return)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. | | Null | Returns Null if the calling Set is empty. | ## [](#examples)Examples 1. Starting with the following product prices: ```fql // Gets a Set of the first nine `Product` collection // documents and projects the `price` field. Product.all().take(9) { price } ``` ``` { data: [ { price: 698 }, { price: 2499 }, { price: 499 }, { price: 399 }, { price: 35 }, { price: 349 }, { price: 299 }, { price: 149 }, { price: 2399 } ] } ``` 2. Use `reduce()` to find the maximum price in the Set: ```fql Product.all().take(9).reduce((s, v) => { if (v.price > s.price) { v } else { s } }) {price} ``` ``` { price: 2499 } ``` # `set.reduceRight()` | Learn: Sets | | --- | --- | --- | Reduce a [Set](../../../fql/types/#set) to a single, accumulated value by applying a provided [function](../../../fql/functions/) to each element. Iterates through elements from right to left. Uses the last element as the initial value. ## [](#signature)Signature ```fql-sig reduceRight(reducer: ((A, A) => A)) => A | Null ``` ## [](#description)Description Iterates through each element in a Set to perform a rolling operation. For example, you can use `reduceRight()` to calculate a rolling sum, concatenate elements, or perform complex transformations. `reduceRight()` calls a reducer callback function on every element of the Set from right to left. The reducer function takes two arguments: * The accumulator that holds the running result from previous iterations. The last element in the Set serves as the initial accumulator. * The current element’s value. The method returns the result of the last iteration. The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled.
For example, the following query: ```fql // Use `reduceRight()` to sum stock counts let stockCounts = Product.all().map(doc => doc.stock) stockCounts.reduceRight((a, b) => a + b) ``` Emits the following hint: ``` performance_hint: full_set_read - Using reduceRight() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:24 | 3 | stockCounts.reduceRight((a, b) => a + b) | ^^^^^^^^^^^^^^^^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` let stockCounts = Product.all().take(20).map(doc => doc.stock) stockCounts.reduceRight((a, b) => a + b) ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#fold-family-methods)Fold family methods FQL supports several methods for [folds](https://en.wikipedia.org/wiki/Fold_\(higher-order_function\)), which iteratively reduce a Set to a single value. These methods include: * [`set.fold()`](../fold/) * [`set.foldRight()`](../foldright/) * [`set.reduce()`](../reduce/) * [`set.reduceRight()`](./) The methods are similar but have the following differences: * [`set.fold()`](../fold/) and [`set.foldRight()`](../foldright/) accept an initial _seed_ value and use it as the initial _accumulator_. [`set.reduce()`](../reduce/) and [`set.reduceRight()`](./) use the Set’s first element as the initial _accumulator_. * [`set.fold()`](../fold/) and [`set.reduce()`](../reduce/) iterate through the Set’s elements from left to right. [`set.foldRight()`](../foldright/) and [`set.reduceRight()`](./) iterate through the Set’s elements from right to left. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | reducer | Function | true | Anonymous FQL function to call on each element in the Set. | ### [](#reducer-function-parameters)Reducer function parameters: | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | accumulator | Generic | true | Value returned by the previous reducer function call. On the first call, the Set’s last element is passed as the accumulator. | | current | Generic | true | The current element’s value. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Generic | Result of the last reducer function call. | | Null | Returns Null if the calling Set is empty. | ## [](#examples)Examples Reduce the [Set](../../../fql/types/#set)'s elements, right to left: ```fql // `toSet()` converts an Array to a Set. let set = [1, 2, 3].toSet() set.reduceRight((acc, elem) => acc + elem) ``` ``` 6 ``` # `set.reverse()` | Learn: Sets | | --- | --- | --- | Reverse the order of a [Set](../../../fql/types/#set)'s elements. | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig reverse() => Set ``` ## [](#description)Description Reverses the order of the calling [Set](../../../fql/types/#set)'s elements. The calling Set isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set containing reversed elements. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set.
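// `reverse()` uses lazy loading, so elements are only fetched
// as the reversed Set is returned or used.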
let set = [1, 2, 3].toSet() set.reverse() ``` ``` { data: [ 3, 2, 1 ] } ``` # `set.take()` | Learn: Sets | | --- | --- | --- | Get the first _N_ elements of a [Set](../../../fql/types/#set). | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig take(limit: Number) => Set ``` ## [](#description)Description Takes the first _N_ elements from the calling [Set](../../../fql/types/#set) and returns them as a new [Set](../../../fql/types/#set). If the [Set](../../../fql/types/#set) contains fewer than _N_ elements, all elements are returned. The calling Set isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | limit | Number | true | Number of elements to return from the start of the Set. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set containing the requested elements. | ## [](#examples)Examples Get the first two documents in the [Set](../../../fql/types/#set) of `Product` documents: ```fql Product.all().take(2) ``` ``` { data: [ { id: "111", coll: Product, ts: Time("2099-10-22T21:56:31.260Z"), name: "cups", description: "Translucent 9 Oz, 100 ct", price: 698, stock: 100, category: Category("123") }, { id: "222", coll: Product, ts: Time("2099-10-22T21:56:31.260Z"), name: "donkey pinata", description: "Original Classic Donkey Pinata", price: 2499, stock: 50, category: Category("123") } ] } ``` # `set.toArray()` | Learn: Sets | | --- | --- | --- | Convert a [Set](../../../fql/types/#set) to an [Array](../../../fql/types/#array). ## [](#signature)Signature ```fql-sig toArray() => Array ``` ## [](#description)Description Converts the calling [Set](../../../fql/types/#set) to an [Array](../../../fql/types/#array). The calling Set isn’t changed. ### [](#eager-loading)Eager loading This method uses [eager loading](../#eager) and requires a read of each document in the calling Set. For large Sets, this may result in poor performance and high costs. Performance hint: `full_set_read` Queries that call this method on a document Set emit a [performance hint](../../../http/reference/query-summary/#perf), if enabled. For example, the following query: ```fql // Use `toArray()` to convert the `Product` collection // to an Array. Product.all().toArray() ``` Emits the following hint: ``` performance_hint: full_set_read - Using toArray() causes the full set to be read. See https://docs.faunadb.org/performance_hint/full_set_read. at *query*:3:22 | 3 | Product.all().toArray() | ^^ | ``` To address the hint, use [`set.take()`](../take/) to explicitly limit the size of the calling Set to fewer than 100 documents: ```fql // Limit the doc Set's size using `take()` Product.all().take(20).toArray() ``` This applies even if the original, unbounded Set contains fewer than 100 documents. Alternatively, you can rewrite the query to avoid calling the method. ### [](#avoid-use-on-large-sets)Avoid use on large sets Because this method scans the full Set, it returns an error if there are more than 16,000 documents in the Set. This method can time out for large Sets under that limit. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array representation of the Set instance. | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set.
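// `toArray()` materializes every element. For document Sets, it
// returns an error if the Set contains more than 16,000 documents.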
let set = [1, 2].toSet() set.toArray() ``` ``` [ 1, 2 ] ``` # `set.toString()` | Learn: Sets | | --- | --- | --- | Return the string `"[set]"`. ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description Return the string `"[set]"`. The calling Set isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "[set]". | ## [](#examples)Examples ```fql // `toSet()` converts an Array to a Set. let set = [].toSet() set.toString() ``` ``` "[set]" ``` # `set.where()` | Learn: Sets | | --- | --- | --- | Get the elements of a [Set](../../../fql/types/#set) that match a provided [predicate](../../../fql/functions/#predicates). | Loading strategy: | Lazy loading | | --- | --- | --- | --- | ## [](#signature)Signature ```fql-sig where(predicate: (A => Boolean | Null)) => Set ``` ## [](#description)Description Returns a [Set](../../../fql/types/#set) of elements from the calling [Set](../../../fql/types/#set) that match a provided [predicate function](../../../fql/functions/#predicates). If `where()` is the last value in a query, the first page of the created [Set](../../../fql/types/#set) is returned. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | predicate | Predicate function | Yes | Anonymous predicate function that:Accepts a Set element as its only argument. Supports shorthand-syntax for objects and documents.Returns a Boolean or NullThe method returns a Set of elements for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set containing elements of the calling Set that match the predicate. If there are no matching elements, the Set is empty. | ## [](#examples)Examples ### [](#basic-example)Basic example ```fql Customer.all().where(.address.state == "DC") ``` ``` { data: [ { id: "111", coll: Customer, ts: Time("2099-10-22T21:56:31.260Z"), cart: Order("412483941752112205"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, ... ] } ``` ### [](#filter)Filter covered index values You can use [`set.where()`](./) to filter the results of an [index call](../../../../learn/data-model/indexes/#call). If the [`set.where()`](./) predicate only accesses fields defined in the index definition’s `terms` and `values`, the query is [covered](../../../../learn/data-model/indexes/#covered-queries). For example, given the following index definition: ```fsl collection Product { ... index byName { terms [.name] values [.price, .description] } ... } ``` The following query is covered: ```fql // Covered query. // Calls the `byName()` index. // Uses `where()` to filter the results of // the index call. The predicates only // access covered terms and values. Product.byName("limes") .where(.description.includes("Conventional")) .where(.price < 500) { name, description, price } ``` The following query is uncovered: ```fql Product.byName("limes") .where(.description.includes("Conventional")) // The `where()` predicate accesses the uncovered // `stock` field. .where(.stock < 100) .where(.price < 500) { name, description, price } ``` To cover the query, add the uncovered field to the index definition’s `values`: ```fsl collection Product { ... 
index byName { terms [.name] // Adds `stock` to the index's values values [.price, .description, .stock] } ... } ``` # String [String](../../fql/types/#string) methods and properties. ## [](#description)Description [String](../../fql/types/#string) methods and properties are provided for formatting and manipulating sequences of characters. ## [](#instance-properties)Instance properties | Property | Description | | --- | --- | --- | --- | | length | Get the length of a String. | ## [](#instance-methods)Instance methods | Method | Description | | --- | --- | --- | --- | | string.at() | Get the character at a specified index of a String. | | string.casefold() | Convert a String to lower case using a specified format. | | string.concat() | Concatenate two Strings. | | string.endsWith() | Test if a String ends with a provided suffix. | | string.includes() | Test if a String includes a provided substring. | | string.includesRegex() | Test if a String contains a substring that matches a provided regular expression. | | string.indexOf() | Get the index of the first matching substring within a String. | | string.indexOfRegex() | Get the index of the first substring matching a provided regular expression within a String. | | string.insert() | Insert a substring into a String at a specified index. | | string.lastIndexOf() | Get the index of the last matching substring within a String. | | string.matches() | Get the substrings in a String that match a provided regular expression. | | string.matchIndexes() | Get the indexes and substrings in a String that match a provided regular expression. | | string.parseDouble() | Convert a String to a Double. | | string.parseInt() | Convert a String to a Int. | | string.parseLong() | Convert a String to a Long. | | string.parseNumber() | Convert a String to a Number. | | string.replace() | Replace a specified number of occurrences of a substring in a String. | | string.replaceAll() | Replace all occurrences of a substring in a String. | | string.replaceAllRegex() | Replace all occurrences of substrings matching a regular expression in a String. | | string.replaceRegex() | Replace a specified number of occurrences of substrings matching a regular expression in a String. | | string.slice() | Get the substring between two indexes of a String. | | string.split() | Split a String at a provided separator. | | string.splitAt() | Split a String at a provided index. | | string.splitRegex() | Split a String using a provided regular expression. | | string.startsWith() | Test if a String starts with a provided prefix. | | string.toLowerCase() | Convert a String to lower case. | | string.toString() | Get a String representation of the value. | | string.toUpperCase() | Convert a String to upper case. | # `string.length` Get a [String](../../../fql/types/#string)'s length. ## [](#signature)Signature ```fql-sig .length: Number ``` ## [](#description)Description Returns the calling [String](../../../fql/types/#string)'s length. The length of an empty [String](../../../fql/types/#string) is `0`. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Int | Number of characters in the calling String. | ## [](#examples)Examples Get the length of a [String](../../../fql/types/#string): ```fql "HTTP/1.1 200 OK".length ``` ``` 15 ``` # `string.at()` Get the character at a specified index of a [String](../../../fql/types/#string). 
## [](#signature)Signature ```fql-sig at(index: Number) => String ``` ## [](#description)Description Returns the UTF-16 character located at a zero-based offset index. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | index | Int | true | Zero-based index of the character to return. Must be less than the String's length. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Character located at the specified index. | ## [](#examples)Examples Return the character at index offset `9`: ```fql "HTTP/1.1 200 OK".at(9) ``` ``` "2" ``` # `string.casefold()` Convert a [String](../../../fql/types/#string) to lower case using a specified format. ## [](#signature)Signature ```fql-sig casefold() => String casefold(format: String) => String ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to lower case. This method is similar to [`string.toLowerCase()`](../tolowercase/) but uses an optionally specified format. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | format | String | | Unicode normalization form applied to the converted String: Normalization formDescriptionNFKCCaseFold(Default) Characters are decomposed by compatibility then recomposed by canonical equivalence.NFCCanonical decomposition followed by canonical composition. Characters are decomposed and then recomposed by canonical equivalence.NFDCanonical decomposition. Characters are decomposed by canonical equivalence, and multiple combining characters are arranged in order.NFKCCompatibility decomposition, followed by canonical composition. Characters are decomposed by compatibility then recomposed by canonical equivalence.NFKDCompatibility decomposition. Characters are decomposed by compatibility, and multiple combining characters are arranged in order.See also: Unicode Normalization Forms. | Normalization form | Description | NFKCCaseFold | (Default) Characters are decomposed by compatibility then recomposed by canonical equivalence. | NFC | Canonical decomposition followed by canonical composition. Characters are decomposed and then recomposed by canonical equivalence. | NFD | Canonical decomposition. Characters are decomposed by canonical equivalence, and multiple combining characters are arranged in order. | NFKC | Compatibility decomposition, followed by canonical composition. Characters are decomposed by compatibility then recomposed by canonical equivalence. | NFKD | Compatibility decomposition. Characters are decomposed by compatibility, and multiple combining characters are arranged in order. | | Normalization form | Description | | NFKCCaseFold | (Default) Characters are decomposed by compatibility then recomposed by canonical equivalence. | | NFC | Canonical decomposition followed by canonical composition. Characters are decomposed and then recomposed by canonical equivalence. | | NFD | Canonical decomposition. Characters are decomposed by canonical equivalence, and multiple combining characters are arranged in order. | | NFKC | Compatibility decomposition, followed by canonical composition. Characters are decomposed by compatibility then recomposed by canonical equivalence. | | NFKD | Compatibility decomposition. 
Characters are decomposed by compatibility, and multiple combining characters are arranged in order. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Lower case version of the calling String. | ## [](#examples)Examples 1. Convert an email address to lower case using the default NFKC normalization form: ```fql "john.doe@EXAMPLE.COM".casefold() ``` ``` "john.doe@example.com" ``` 2. Convert an email address to lower case using the NFC normalization form: ```fql "john.doe@EXAMPLE.COM".casefold("NFC") ``` ``` "john.doe@EXAMPLE.COM" ``` 3. Convert a Unicode sequence to lower case using the default normalization form: ```fql "\uff21\u0030a\u0301".casefold() ``` ``` "a0á" ``` 4. Convert the same Unicode sequence using the `NFKC` option: ```fql "\uff21\u0030a\u0301".casefold("NFKC") ``` ``` "A0á" ``` 5. Convert the same Unicode sequence using the `NFC` option: ```fql "\uff21\u0030a\u0301".casefold("NFC") ``` ``` "A0á" ``` # `string.concat()` Concatenate two [String](../../../fql/types/#string)s. ## [](#signature)Signature ```fql-sig concat(other: String) => String ``` ## [](#description)Description Concatenates two provided [String](../../../fql/types/#string)s. This method is equivalent to using the `+` operator with [String](../../../fql/types/#string) operands. The input [String](../../../fql/types/#string)s aren’t modified. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | other | String | true | String to concatenate to the calling String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | The resulting concatenated String. | ## [](#examples)Examples Concatenate the calling [String](../../../fql/types/#string) with the [String](../../../fql/types/#string) "SUCCESS": ```fql "HTTP/1.1 200 OK".concat(" SUCCESS") ``` ``` "HTTP/1.1 200 OK SUCCESS" ``` # `string.endsWith()` Test if a [String](../../../fql/types/#string) ends with a provided suffix. ## [](#signature)Signature ```fql-sig endsWith(suffix: String) => Boolean ``` ## [](#description)Description Tests if the calling [String](../../../fql/types/#string) ends with a provided suffix. An exact match returns `true`. Otherwise, the method returns `false`. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | suffix | String | true | Suffix to compare to the end of the calling String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the calling String ends with the suffix. Otherwise, false. | ## [](#examples)Examples 1. Test whether a [String](../../../fql/types/#string) ends with `200 OK`: ```fql "HTTP/1.1 200 OK".endsWith("200 OK") ``` ``` true ``` 2. Test whether a [String](../../../fql/types/#string) ends with `200`: ```fql "HTTP/1.1 200 OK".endsWith("200") ``` ``` false ``` # `string.includes()` Test if a [String](../../../fql/types/#string) includes a provided substring. ## [](#signature)Signature ```fql-sig includes(pattern: String) => Boolean ``` ## [](#description)Description Tests if the calling [String](../../../fql/types/#string) contains a provided substring. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pattern | String | true | Substring to search for in this String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the calling String contains the provided substring.
Otherwise, false. | ## [](#examples)Examples 1. Test if the calling string includes the [String](../../../fql/types/#string) `200`: ```fql "HTTP/1.1 200 OK".includes("200") ``` ``` true ``` 2. Test if the calling string includes the [String](../../../fql/types/#string) `400`: ```fql "HTTP/1.1 200 OK".includes("400") ``` ``` false ``` # `string.includesRegex()` Test if a [String](../../../fql/types/#string) contains a substring that matches a provided regular expression. ## [](#signature)Signature ```fql-sig includesRegex(regex: String) => Boolean ``` ## [](#description)Description Tests if the calling [String](../../../fql/types/#string) contains a substring that matches a provided regular expression. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | regex | String | true | Regular expression used to match a substring in the calling String. Supports Java regex. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the calling String contains a substring that matches the provided regular expression. Otherwise, false. | ## [](#examples)Examples ```fql 'foo'.includesRegex('[a-z]') ``` ``` true ``` # `string.indexOf()` Get the index of the first matching substring within a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig indexOf(pattern: String) => Number | Null indexOf(pattern: String, start: Number) => Number | Null ``` ## [](#description)Description Returns the zero-based offset index for the first occurrence of a provided substring within the calling [String](../../../fql/types/#string). Starts at an optional start position in the calling [String](../../../fql/types/#string). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pattern | String | true | Substring to find the first occurrence of within the calling String. | | start | Int | | Zero-based index of the character to start searching for matches. Defaults to 0. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Int | Zero-based index of the first matching occurrence in the calling String. | | Null | No match found. | ## [](#examples)Examples 1. Get the starting location of the [String](../../../fql/types/#string) `200`, starting from the beginning of the calling string: ```fql "HTTP/1.1 200 OK".indexOf("200", 0) ``` ``` 9 ``` 2. Get the starting location of the [String](../../../fql/types/#string) `200`, starting at a location past the first occurrence of the substring: ```fql "HTTP/1.1 200 OK".indexOf("200", 10) ``` ``` null ``` # `string.indexOfRegex()` Get the index of the first substring matching a provided regular expression within a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig indexOfRegex(regex: String) => Number | Null indexOfRegex(regex: String, start: Number) => Number | Null ``` ## [](#description)Description Returns the zero-based offset index for the first occurrence of a substring matching a provided regular expression within the calling [String](../../../fql/types/#string). Starts at an optional start position in the calling [String](../../../fql/types/#string). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | regex | String | true | Regular expression used to match substrings in the calling String. Supports Java regex.
| | start | Number | | Zero-based index of the character to start searching for matches. Defaults to 0. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Int | Zero-based index of the first matching occurrence in the calling String. | | Null | No match found. | ## [](#examples)Examples ```fql 'foo 123'.indexOfRegex('[0-9]') ``` ``` 4 ``` ```fql 'foo 123 abc 5678'.indexOfRegex('[0-9]', 10) ``` ``` 12 ``` # `string.insert()` Insert a substring into a [String](../../../fql/types/#string) at a specified index. ## [](#signature)Signature ```fql-sig insert(index: Number, other: String) => String ``` ## [](#description)Description Inserts a substring into the calling [String](../../../fql/types/#string) at a provided zero-based offset index position. Returns a new [String](../../../fql/types/#string). The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | index | Number | true | Zero-based index position to insert the substring at. | | other | String | true | Substring to insert. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | String with the inserted substring. | ## [](#examples)Examples ```fql 'foo'.insert(0, 'bar') ``` ``` "barfoo" ``` # `string.lastIndexOf()` Get the index of the last matching substring within a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig lastIndexOf(pattern: String) => Number | Null lastIndexOf(pattern: String, end: Number) => Number | Null ``` ## [](#description)Description Returns the zero-based offset index for the last occurrence of a provided substring within the calling [String](../../../fql/types/#string). Ends at an optional end position in the calling [String](../../../fql/types/#string). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pattern | String | true | Substring to find the last occurrence of within the calling String. | | end | Int | | Zero-based index of the character to end searching for matches, counting from left to right. Defaults to the last character of the calling String. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Int | Zero-based index of the last matching occurrence in the calling String, ending at the specified end character. | | Null | No match found. | ## [](#examples)Examples 1. Get the location of the last occurrence of the [String](../../../fql/types/#string) "200" at or below index location 27: ```fql "HTTP/1.1 200 OK - SUCCESS (200)".lastIndexOf("200", 27) ``` ``` 27 ``` 2. Get the location of the last occurrence of the [String](../../../fql/types/#string) "200" at or below index location 20: ```fql "HTTP/1.1 200 OK - SUCCESS (200)".lastIndexOf("200", 20) ``` ``` 9 ``` 3. Get the location of the last occurrence of the [String](../../../fql/types/#string) "200" at or below index location 8, which fails because the [String](../../../fql/types/#string) isn’t found: ```fql "HTTP/1.1 200 OK - SUCCESS (200)".lastIndexOf("200", 8) ``` ``` null ``` # `string.matches()` Get the substrings in a [String](../../../fql/types/#string) that match a provided regular expression.
## [](#signature)Signature ```fql-sig matches(regex: String) => Array ``` ## [](#description)Description Returns an [Array](../../../fql/types/#array) of substrings in the calling [String](../../../fql/types/#string) that match a provided regular expression. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | regex | String | true | Regular expression to find matches for in the calling String. Supports Java regex. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Substrings that match the provided regular expression. | ## [](#examples)Examples ```fql 'foo bar baz'.matches('bar') ``` ``` [ "bar" ] ``` ```fql 'foo bar baz'.matches('[a-z]+') ``` ``` [ "foo", "bar", "baz" ] ``` # `string.matchIndexes()` Get the indexes and substrings in a [String](../../../fql/types/#string) that match a provided regular expression. ## [](#signature)Signature ```fql-sig matchIndexes(regex: String) => Array<[Number, String]> ``` ## [](#description)Description Returns an [Array](../../../fql/types/#array) of substrings and their indexes in the calling [String](../../../fql/types/#string) that match a provided regular expression. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | regex | String | true | Regular expression to find matches for in the calling String. Supports Java regex. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Index and substrings that matches the provided regular expression. | ## [](#examples)Examples ```fql 'foobarbaz'.matchIndexes('bar') ``` ``` [ [ 3, "bar" ] ] ``` ```fql 'foo bar baz'.matchIndexes('[a-z]+') ``` ``` [ [ 0, "foo" ], [ 4, "bar" ], [ 8, "baz" ] ] ``` # `string.parseDouble()` Convert a [String](../../../fql/types/#string) to a [Double](../../../fql/types/#double). ## [](#signature)Signature ```fql-sig parseDouble() => Number | Null ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to a numeric [Double](../../../fql/types/#double). The calling [String](../../../fql/types/#string) isn’t changed. ### [](#limitations)Limitations * Leading and trailing whitespace are allowed. * Exponential notation is allowed. * Comma separators aren’t allowed. ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Double parsed from the calling string. | | Null | Unable to parse calling String. | ## [](#examples)Examples 1. Convert a simple numeric value: ```fql "2147483647".parseDouble() ``` ``` 2.147483647E9 ``` 2. Convert a value represented in exponential notation: ```fql "1.7976931348623158e308".parseDouble() ``` ``` 1.7976931348623157e+308 ``` # `string.parseInt()` Convert a [String](../../../fql/types/#string) to a [Int](../../../fql/types/#int). ## [](#signature)Signature ```fql-sig parseInt() => Number | Null parseInt(radix: Number) => Number | Null ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to a numeric [Int](../../../fql/types/#int). The calling [String](../../../fql/types/#string) isn’t changed. ### [](#limitations)Limitations * Leading and trailing whitespace and comma separators result in an error. * Exponential notation isn’t allowed. 
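The optional `radix` parameter, described in the table below, controls the base used to parse the calling String. A minimal sketch, assuming standard base-16 parsing (the input value is illustrative):

```fql
// Parse a hexadecimal String by passing a base-16 radix.
"ff".parseInt(16)
```

```
255
```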
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | radix | Number | | Value between 2 and 36 that represents the radix of the calling String. Default = 10. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Int parsed from the calling String. | | Null | Unable to parse calling String. | ## [](#examples)Examples Convert a value using the default, base 10 radix: ```fql "2147483647".parseInt() ``` ``` 2147483647 ``` # `string.parseLong()` Convert a [String](../../../fql/types/#string) to a [Long](../../../fql/types/#long). ## [](#signature)Signature ```fql-sig parseLong() => Number | Null parseLong(radix: Number) => Number | Null ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to a numeric [Long](../../../fql/types/#long). The calling [String](../../../fql/types/#string) isn’t changed. ### [](#limitations)Limitations * Leading and trailing whitespace and comma separators result in an error. * Exponential notation isn’t allowed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | radix | Number | | Value between 2 and 36 that represents the radix of the calling String. Default = 10. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Long parsed from the calling String. | | Null | Unable to parse calling String. | ## [](#examples)Examples Convert a value using the default, base 10 radix: ```fql "10".parseLong() ``` ``` 10 ``` # `string.parseNumber()` Convert a [String](../../../fql/types/#string) to a [Number](../../../fql/types/#number). ## [](#signature)Signature ```fql-sig parseNumber() => Number | Null ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to a numeric [Number](../../../fql/types/#number). The method attempts to parse the calling [String](../../../fql/types/#string) in the following order: 1. [Int](../../../fql/types/#int) 2. [Long](../../../fql/types/#long) 3. [Double](../../../fql/types/#double) The calling [String](../../../fql/types/#string) isn’t changed. ### [](#limitations)Limitations * Leading and trailing whitespace are allowed. * Exponential notation is allowed. * Comma separators aren’t allowed. ## [](#parameters)Parameters None ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Number | Value parsed from the calling String. | | Null | Unable to parse calling String. | ## [](#examples)Examples 1. Convert an exponential value: ```fql "1.7976931348623158e308".parseNumber() ``` ``` 1.7976931348623157e+308 ``` 2. Convert the [String](../../../fql/types/#string) to a number larger than [Number](../../../fql/types/#number) can hold: ```fql "1.7976931348623159e308".parseNumber() ``` ``` Math.Infinity ``` # `string.replace()` Replace a specified number of occurrences of a substring in a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig replace(pattern: String, replacement: String) => String replace(pattern: String, replacement: String, amount: Number) => String ``` ## [](#description)Description Replaces the occurrences of a provided substring in the calling [String](../../../fql/types/#string) with a provided replacement for a specified number of times. Returns a new [String](../../../fql/types/#string). The calling [String](../../../fql/types/#string) isn’t changed.
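A quick sketch of the optional `amount` argument covered in the table below (the values are illustrative): passing `2` replaces only the first two occurrences and leaves later ones untouched.

```fql
// Replace only the first two occurrences of "foo".
'foo foo foo'.replace('foo', 'bar', 2)
```

```
"bar bar foo"
```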
## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pattern | String | true | Substring to match in the calling String. | | replacement | String | true | Replacement for matching substrings in the calling String. | | amount | Number | | Number of replacements to make in the calling String. Defaults to 1. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Resulting String with replacements. | ## [](#examples)Examples ```fql 'foobar'.replace('foo', 'bar') ``` ``` "barbar" ``` # `string.replaceAll()` Replace all occurrences of a substring in a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig replaceAll(pattern: String, replacement: String) => String ``` ## [](#description)Description Replaces all occurrences of a provided substring in the calling [String](../../../fql/types/#string) with a provided replacement. Returns a new [String](../../../fql/types/#string). The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pattern | String | true | Substring to match in the calling String. | | replacement | String | true | Replacement for matching substrings in the calling String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Resulting String with replacements. | ## [](#examples)Examples ```fql 'foobar'.replaceAll('foo', 'bar') ``` ``` "barbar" ``` # `string.replaceAllRegex()` Replace all occurrences of substrings matching a regular expression in a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig replaceAllRegex(pattern: String, replacement: String) => String ``` ## [](#description)Description Replaces all occurrences of substrings matching a regular expression in the calling [String](../../../fql/types/#string) with a provided replacement. Returns a new [String](../../../fql/types/#string). The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pattern | String | true | Regular expression to match in the calling String. Supports Java regex. | | replacement | String | true | Replacement for matches in the calling String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Resulting String with replacements. | ## [](#examples)Examples ```fql "1234".replaceAllRegex('\\w', 'abc-') ``` ``` "abc-abc-abc-abc-" ``` # `string.replaceRegex()` Replace a specified number of occurrences of substrings matching a regular expression in a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig replaceRegex(regex: String, replacement: String) => String replaceRegex(regex: String, replacement: String, amount: Number) => String ``` ## [](#description)Description Replaces the occurrences of substrings matching a regular expression in the calling [String](../../../fql/types/#string) with a provided replacement for a specified number of times. Returns a new [String](../../../fql/types/#string). The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | regex | String | true | Regular expression to match in the calling String. Supports Java regex. 
| | replacement | String | true | Replacement for matching substrings in the calling String. | | amount | Number | | Number of replacements to make in the calling String. Defaults to 1. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Resulting String with replacements. | ## [](#examples)Examples ```fql 'foo bar'.replaceRegex('\\w', 'z', 2) ``` ``` "zzo bar" ``` # `string.slice()` Get the substring between two indexes of a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig slice(start: Number) => String slice(start: Number, end: Number) => String ``` ## [](#description)Description Extracts the substring between provided two zero-based offset indexes of the calling [String](../../../fql/types/#string). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | start | Int | true | Starting index of the substring to extract (inclusive). The index is a zero-based offset, counted from the left. If this start index is greater than or equal to the length of the String, the method returns an empty String. | | end | Int | true | Ending index of the substring to extract (exclusive). The index is a zero-based offset, counted from the left. If this end index is less than the start index, the method returns an empty String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Substring extracted from the calling String. | ## [](#examples)Examples Get the sub[String](../../../fql/types/#string) from index 9 up to, but not including, 15: ```fql "HTTP/1.1 200 OK".slice(9, 15) ``` ``` "200 OK" ``` # `string.split()` Split a [String](../../../fql/types/#string) at a provided separator. ## [](#signature)Signature ```fql-sig split(separator: String) => Array ``` ## [](#description)Description Splits the calling [String](../../../fql/types/#string) at every occurrence of a provided separator. Returns an array of the resulting [String](../../../fql/types/#string)s. The separator isn’t preserved in the results. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | separator | String | true | Separator to split the calling string[] at. Splits at every occurrence of the separator. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array of Strings resulting from the split. The separator isn’t preserved in the results. | ## [](#examples)Examples ```fql 'foobarbaz'.split('b') ``` ``` [ "foo", "ar", "az" ] ``` # `string.splitAt()` Split a [String](../../../fql/types/#string) at a provided index. ## [](#signature)Signature ```fql-sig splitAt(index: Number) => [String, String] ``` ## [](#description)Description Splits the calling [String](../../../fql/types/#string) at a zero-based offset index. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | index | Number | true | Zero-based offset index to split the calling String at. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array of Strings resulting from the split. | ## [](#examples)Examples ```fql 'foobar'.splitAt(3) ``` ``` [ "foo", "bar" ] ``` # `string.splitRegex()` Split a [String](../../../fql/types/#string) using a provided regular expression. 
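As noted in the description below, adjacent delimiters produce an empty element in the result. A brief sketch, assuming a literal `,` as the regular expression:

```fql
// The two adjacent commas yield an empty String element.
'foo,,bar'.splitRegex(',')
```

```
[ "foo", "", "bar" ]
```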
## [](#signature)Signature ```fql-sig splitRegex(regex: String) => Array ``` ## [](#description)Description Splits the calling [String](../../../fql/types/#string) at every occurrence of a provided regular expression, used as a delimiter string. The calling [String](../../../fql/types/#string) isn’t changed. Multiple, adjacent delimiters in the calling [String](../../../fql/types/#string) result in an empty element in the resulting [Array](../../../fql/types/#array). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | regex | String | true | Regular expression to split the calling String at. Splits at every matching substring. Supports Java regex. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Array | Array of Strings resulting from the split. | ## [](#examples)Examples ```fql 'foo bar baz'.splitRegex('\\W+') ``` ``` [ "foo", "bar", "baz" ] ``` # `string.startsWith()` Test if a [String](../../../fql/types/#string) starts with a provided prefix. ## [](#signature)Signature ```fql-sig startsWith(prefix: String) => Boolean ``` ## [](#description)Description Tests if the calling [String](../../../fql/types/#string) starts with a provided prefix. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | prefix | String | true | Prefix to compare to the beginning of the calling String. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the calling String starts with the prefix. Otherwise, false. | ## [](#examples)Examples 1. Test whether a [String](../../../fql/types/#string) starts with `HTTP`: ```fql "HTTP/1.1 200 OK".startsWith("HTTP") ``` ``` true ``` 2. Test whether a [String](../../../fql/types/#string) starts with `200`: ```fql "HTTP/1.1 200 OK".startsWith("200") ``` ``` false ``` # `string.toLowerCase()` Convert a [String](../../../fql/types/#string) to lower case. ## [](#signature)Signature ```fql-sig toLowerCase() => String ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to lower case. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Lower case version of the calling String. | ## [](#examples)Examples Convert an email address to lower case: ```fql "john.doe@EXAMPLE.COM".toLowerCase() ``` ``` "john.doe@example.com" ``` # `string.toString()` Get a [String](../../../fql/types/#string) representation of the value. ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description Returns a [String](../../../fql/types/#string) representation of the calling value. If the calling value is a [String](../../../fql/types/#string), it returns the calling [String](../../../fql/types/#string). The calling value isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | String representation of the calling String.
| ## [](#examples)Examples ### [](#pass-a-string)Pass a String Confirm that the calling string type returns itself: ```fql 'foobar'.toString() ``` ``` "foobar" ``` ### [](#pass-a-time-value)Pass a Time value Get the [String](../../../fql/types/#string) representation of a [`Time()`](../../time/time/) value: ```fql let t1 = Time.fromString("2099-10-20T21:15:09.890729Z") t1.toString() ``` ``` "2099-10-20T21:15:09.890729Z" ``` ### [](#pass-a-number)Pass a Number Get the [String](../../../fql/types/#string) representation of a floating point number: ```fql 1.5.toString() ``` ``` "1.5" ``` ### [](#pass-a-null-value)Pass a Null value If passed [Null](../../../fql/types/#null), `toString()` returns a `"null"` string representation. For example: ```fql {a: null}.a?.toString() ``` ``` "null" ``` # `string.toUpperCase()` Convert a [String](../../../fql/types/#string) to upper case. ## [](#signature)Signature ```fql-sig toUpperCase() => String ``` ## [](#description)Description Converts the calling [String](../../../fql/types/#string) to upper case. The calling [String](../../../fql/types/#string) isn’t changed. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | Upper case version of the calling String. | ## [](#examples)Examples Convert an email address to upper case: ```fql "john.doe@example.com".toUpperCase() ``` ``` "JOHN.DOE@EXAMPLE.COM" ``` # Time [Time](../../fql/types/#time) methods and properties.. ## [](#description)Description [Time](../../fql/types/#time) methods are provided to represent an instantaneous point in time. ## [](#instance-properties)Instance properties | Method | Description | | --- | --- | --- | --- | | dayOfMonth | Get the day of the month from a Time. | | dayOfWeek | Get the day of the week from a Time. | | dayOfYear | Get the day of the year from a Time. | | hour | Get the hour of a Time. | | minute | Get the minute of a Time. | | month | Get the month of a Time. | | second | Get the second of a Time. | | year | Get the year of a Time. | ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | Time() | Construct a Time from an ISO 8601 timestamp String. | | Time.epoch() | Convert a Unix epoch timestamp to a Time. | | Time.fromString() | Construct a Time from an ISO 8601 timestamp String. | | Time.now() | Get the current UTC Time. | ## [](#instance-methods)Instance methods | Method | Description | | --- | --- | --- | --- | | time.add() | Add a time interval to a Time. | | time.difference() | Get the difference between two Times. | | time.subtract() | Subtract a time interval from a Time. | | time.toMicros() | Convert a Time to a Unix epoch timestamp in microseconds. | | time.toMillis() | Convert a Time to a Unix epoch timestamp in milliseconds. | | time.toSeconds() | Convert a Time to a Unix epoch timestamp in seconds. | | time.toString() | Convert a Time to a String. | # `dayOfMonth` Get the day of the month from a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig dayOfMonth: Number ``` ## [](#description)Description Get the day-of-month. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Day-of-month field of the Time instance. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').dayOfMonth ``` ``` 10 ``` # `dayOfWeek` Get the day of the week from a [Time](../../../fql/types/#time). 
## [](#signature)Signature ```fql-sig dayOfWeek: Number ``` ## [](#description)Description Get the day-of-week. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Day-of-week field of the Time instance:1 = Monday2 = Tuesday3 = Wednesday4 = Thursday5 = Friday6 = Saturday7 = Sunday | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').dayOfWeek ``` ``` 2 ``` # `dayOfYear` Get the day of the year from a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig dayOfYear: Number ``` ## [](#description)Description Get the day-of-year. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Day-of-year field of Time instance. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').dayOfYear ``` ``` 41 ``` # `hour` Get the hour of a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig hour: Number ``` ## [](#description)Description Get the [Time](../../../fql/types/#time) hour field. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Hour field of Time instance. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').hour ``` ``` 12 ``` # `minute` Get the minute of a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig minute: Number ``` ## [](#description)Description Get the [Time](../../../fql/types/#time) minute field. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Minutes field of the Time instance. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').minute ``` ``` 1 ``` # `month` Get the month of a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig month: Number ``` ## [](#description)Description Get the [Time](../../../fql/types/#time) month field. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | The month. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').month ``` ``` 2 ``` # `second` Get the second of a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig second: Number ``` ## [](#description)Description Get the [Time](../../../fql/types/#time) second field. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Seconds field of Time instance. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').second ``` ``` 19 ``` # `year` Get the year of a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig year: Number ``` ## [](#description)Description Get the [Time](../../../fql/types/#time) year field. ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | The year. | ## [](#examples)Examples ```fql Time('2099-02-10T12:01:19.000Z').year ``` ``` 2099 ``` # `Time()` Construct a [Time](../../../fql/types/#time) from an ISO 8601 timestamp [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Time(time: String) => Time ``` ## [](#description)Description Converts an ISO 8601 timestamp [String](../../../fql/types/#string) to a [Time](../../../fql/types/#time) value. Parameter fields: | Time field | Description | | --- | --- | --- | --- | | yyyy | Four-digit year. | | MM | Month, from 01 to 12. | | dd | Day, from 01 to 31. | | T | Date and time separator. | | hh | Hours, from 00 to 23. | | mm | Minutes, from 00 to 59. 
| | ss | Seconds, from 00 to 59, which can also be expressed as a decimal fraction to give nanosecond resolution. | | TZO | Timezone offset from UTC which can be one of: TimezoneDescriptionZUTC, no offset+hhmmPositive hour and minute offset from UTC.-hhmmNegative hour and minute offset from UTC. | Timezone | Description | Z | UTC, no offset | +hhmm | Positive hour and minute offset from UTC. | -hhmm | Negative hour and minute offset from UTC. | | Timezone | Description | | Z | UTC, no offset | | +hhmm | Positive hour and minute offset from UTC. | | -hhmm | Negative hour and minute offset from UTC. | This method is equivalent to [`Time.fromString()`](../fromstring/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | time | String | true | Timestamp string in the form yyyy-MM-ddThh:mm:ssTZO. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Time | Time representation of the timestamp string. | ## [](#examples)Examples Convert a time [String](../../../fql/types/#string) to a [Time](../../../fql/types/#time) value: ```fql Time("2099-10-20T21:15:09.890729Z") ``` ``` Time("2099-10-20T21:15:09.890729Z") ``` Test if a document timestamp is equal to a given time: ```fql Customer.where(.ts == Time("2099-06-25T20:23:49.070Z")) ``` ``` { data: [ { id: "111", coll: Customer, ts: Time("2099-06-25T20:23:49.070Z"), cart: Order("412483941752112205"), orders: "hdW...", name: "Alice Appleseed", email: "alice.appleseed@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, ... ] } ``` # `Time.epoch()` Convert a Unix epoch timestamp to a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig Time.epoch(offset: Number, unit: String) => Time ``` ## [](#description)Description Converts a [Number](../../../fql/types/#number) representing a Unix epoch timestamp to a [Time](../../../fql/types/#time) value. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | offset | Number | true | Offset from the Unix epoch. | | unit | String | true | Time unit used to measure the offset from the Unix epoch.nanosecondsmicrosecondsmillisecondssecondsminuteshoursdays | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Time | Resulting Time, rounded to the nearest nanosecond. | ## [](#examples)Examples ```fql Time.epoch(1676030400, 'seconds') ``` ``` Time("2023-02-10T12:00:00Z") ``` # `Time.fromString()` Construct a [Time](../../../fql/types/#time) from an ISO 8601 timestamp [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Time.fromString(time: String) => Time ``` ## [](#description)Description Converts an ISO 8601 timestamp [String](../../../fql/types/#string) to a [Time](../../../fql/types/#time) value. Parameter fields: | Time field | Description | | --- | --- | --- | --- | | yyyy | Four-digit year. | | MM | Month, from 01 to 12. | | dd | Day, from 01 to 31. | | T | Date and time separator. | | hh | Hours, from 00 to 23. | | mm | Minutes, from 00 to 59. | | ss | Seconds, from 00 to 59, which can also be expressed as a decimal fraction to give nanosecond resolution. | | TZO | Timezone offset from UTC which can be one of: TimezoneDescriptionZUTC, no offset+hhmmPositive hour and minute offset from UTC.-hhmmNegative hour and minute offset from UTC. 
| Timezone | Description | Z | UTC, no offset | +hhmm | Positive hour and minute offset from UTC. | -hhmm | Negative hour and minute offset from UTC. | | Timezone | Description | | Z | UTC, no offset | | +hhmm | Positive hour and minute offset from UTC. | | -hhmm | Negative hour and minute offset from UTC. | This method is equivalent to [`Time()`](../time/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | time | String | true | Timestamp string in the form yyyy-MM-ddThh:mm:ssTZO. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Time | Time representation of the timestamp string. | ## [](#examples)Examples Convert a time [String](../../../fql/types/#string) to a [Time](../../../fql/types/#time) value: ```fql Time.fromString("2099-10-20T21:15:09.890729Z") ``` ``` Time("2099-10-20T21:15:09.890729Z") ``` # `Time.now()` Get the current UTC [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig Time.now() => Time ``` ## [](#description)Description The `Time.now()` method gets the start time of the current query. Calling `Time.now()` multiple times in a query returns the same value each time. The [Time](../../../fql/types/#time) object returned is in [UTC](https://en.wikipedia.org/wiki/Coordinated_Universal_Time) and has nanosecond resolution. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Time | ISO 8601 time value representing the start time of the current query. | ## [](#examples)Examples Get the query start time: ```fql Time.now() ``` ``` Time("2099-10-07T14:43:33.469Z") ``` # `time.add()` Add a time interval to a [Time](../../../fql/types/#time). ## [](#signature)Signature ```fql-sig add(amount: Number, unit: String) => Time ``` ## [](#description)Description Add a time interval to [Time](../../../fql/types/#time). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | amount | Number | true | Number of units to add to the given time. | | unit | String | true | Unit for the operation. Accepts one of the following:nanosecondsmicrosecondsmillisecondssecondsminuteshoursdays | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Time | New time with the added time interval, rounded to the nearest nanosecond. | ## [](#examples)Examples ```fql Time('2099-02-10T12:00:00.000Z').add(19, 'minutes') ``` ``` Time("2099-02-10T12:19:00Z") ``` # `time.difference()` Get the difference between two [Time](../../../fql/types/#time)s. ## [](#signature)Signature ```fql-sig difference(start: Time, unit: String) => Number ``` ## [](#description)Description Get the difference between two times. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | start | Time | true | Time to subtract from instance Time. | | unit | String | true | Time units:nanosecondsmicrosecondsmillisecondssecondsminuteshoursdays | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Difference between the provided times, in units rounded to the nearest nanosecond. | ## [](#examples)Examples ```fql Time('2099-02-10T12:00:00.000Z').difference(Time('2099-02-01T12:00:00.000Z'), 'days') ``` ``` 9 ``` # `time.subtract()` Subtract a time interval from a [Time](../../../fql/types/#time). 
## [](#signature)Signature ```fql-sig subtract(amount: Number, unit: String) => Time ``` ## [](#description)Description Subtract a time interval from [Time](../../../fql/types/#time). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | amount | Number | true | Number of units to subtract from the given time. | | unit | String | true | Time units:nanosecondsmicrosecondsmillisecondssecondsminuteshoursdays | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Time | Resulting time, rounded to the nearest nanosecond. | ## [](#examples)Examples ```fql Time('2099-02-10T12:00:00.000Z').subtract(19, 'minutes') ``` ``` Time("2099-02-10T11:41:00Z") ``` # `time.toMicros()` Convert a [Time](../../../fql/types/#time) to a Unix epoch timestamp in microseconds. ## [](#signature)Signature ```fql-sig toMicros() => Number ``` ## [](#description)Description Convert a [Time](../../../fql/types/#time) to a Unix epoch timestamp in microseconds. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Unix epoch timestamp in microseconds. | ## [](#examples)Examples ```fql Time("2099-02-10T12:01:19.000Z").toMicros() ``` ``` 4074408079000000 ``` # `time.toMillis()` Convert a [Time](../../../fql/types/#time) to a Unix epoch timestamp in milliseconds. ## [](#signature)Signature ```fql-sig toMillis() => Number ``` ## [](#description)Description Convert a [Time](../../../fql/types/#time) to a Unix epoch timestamp in milliseconds. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Unix epoch timestamp in milliseconds. | ## [](#examples)Examples ```fql Time("2099-02-10T12:01:19.000Z").toMillis() ``` ``` 4074408079000 ``` # `time.toSeconds()` Convert a [Time](../../../fql/types/#time) to a Unix epoch timestamp in seconds. ## [](#signature)Signature ```fql-sig toSeconds() => Number ``` ## [](#description)Description Convert a [Time](../../../fql/types/#time) to a Unix epoch timestamp in seconds. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Number | Unix epoch timestamp in seconds. | ## [](#examples)Examples ```fql Time("2099-02-10T12:01:19.000Z").toSeconds() ``` ``` 4074408079 ``` # `time.toString()` Convert a [Time](../../../fql/types/#time) to a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig toString() => String ``` ## [](#description)Description Converts the calling [Time](../../../fql/types/#time) to an ISO 8601 [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | String representation of the calling Time. | ## [](#examples)Examples ```fql let t = Time("2099-10-20T21:15:09.890729Z") t.toString() ``` ``` "2099-10-20T21:15:09.890729Z" ``` # Token | Learn: Tokens | | --- | --- | --- | A [token](../../../learn/security/tokens/) is a type of [authentication secret](../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. You typically create and use tokens as part of a Fauna-based [end-user authentication system](../../../build/tutorials/auth/). ## [](#collection)`Token` collection Fauna stores tokens as documents in the `Token` system collection.
`Token` documents have the following FQL structure: ``` { id: "401671202234433613", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), ttl: Time("2099-06-27T13:32:39.240Z"), document: Customer("401670531121676365"), secret: "fn..." } ``` | Field name | Value type | Read-only | Required | Description | | --- | --- | --- | --- | --- | --- | --- | | id | ID | | | ID for the Token document. The ID is a string-encoded, 64-bit unsigned integer in the 253-1 range. The ID is unique within the collection.IDs are assigned at document creation. To create a token with a user-provided id using Token.create(), you must use a secret with the create_with_id privilege for the Token collection. If not provided, Fauna generates the id. | | coll | Collection | true | | Collection name: Token. | | ts | Time | true | | Last time the document was created or updated. | | ttl | Time | | | Time-to-live (TTL) for the document. Only present if set. If not present or set to null, the document persists indefinitely. | | document | Ref | | true | The identity document associated with the token. | | secret | String | | | Randomly generated cryptographic hash. Equivalent to a password. | | data | Object | | | Arbitrary user-defined metadata for the document. | ## [](#static-methods)Static methods You can use the following static methods to manage the `Token` collection in FQL. | Method | Description | | --- | --- | --- | --- | | Token.all() | Get a Set of all tokens. | | Token.byDocument() | Get a token by its identity document. | | Token.byId() | Get a token by its document id. | | Token.create() | Create a token without a credential or related password. | | Token.firstWhere() | Get the first token that matches a provided predicate. | | Token.toString() | Get "Token" as a String. | | Token.where() | Get a Set of tokens that match a provided predicate. | ## [](#instance-methods)Instance methods You can use the following instance methods to manage specific `Token` documents in FQL. | Method | Description | | --- | --- | --- | --- | | token.delete() | Delete a token. | | token.exists() | Test if a token exists. | | token.replace() | Replace a token. | | token.update() | Update a token. | # `Token.all()` | Learn: Tokens | | --- | --- | --- | Get a Set of all [tokens](../../../../learn/security/tokens/). ## [](#signature)Signature ```fql-sig Token.all() => Set Token.all(range: { from: Any } | { to: Any } | { from: Any, to: Any }) => Set ``` ## [](#description)Description Gets a Set containing all [tokens](../../../../learn/security/tokens/), represented as [`Token` documents](../), for the database. To limit the returned Set, you can provide an optional range. If this method is the last expression in a query, the first page of the Set is returned. See [Pagination](../../../../learn/query/pagination/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | range | { from: Any } | { to: Any } | { from: Any, to: Any } | | Specifies a range of Token documents in the form { from: start, to: end }. from and to arguments should be in the order returned by an unbounded Token.all() call. See Range examples.The Set only includes documents in this range (inclusive). Omit from or to to run unbounded range searches.If a range is omitted, all tokens are returned. | ### [](#range-parameters)Range parameters | Name | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | from | Any | | Beginning of the range (inclusive). Must be an Token document. 
| | to | Any | | End of the range (inclusive). Must be a Token document. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Token documents in the provided range. If a range is omitted, all tokens are returned.The Set is empty if:The database has no tokens.There are no tokens in the provided range.The provided range’s from value is greater than to. | ## [](#examples)Examples ### [](#range)Range examples 1. Get all tokens for the database: ```fql Token.all() ``` ``` { data: [ { id: "123", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("111") }, { id: "456", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("222") }, { id: "789", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("333") } ] } ``` 2. Given the previous Set, get all tokens starting with ID `456` (inclusive): ```fql Token.all({ from: Token.byId("456") }) ``` ``` { data: [ { id: "456", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("222") }, { id: "789", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("333") } ] } ``` 3. Get a Set of tokens from ID `456` (inclusive) to `789` (inclusive): ```fql Token.all({ from: Token.byId("456"), to: Token.byId("789") }) ``` ``` { data: [ { id: "456", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("222") }, { id: "789", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("333") } ] } ``` 4. Get a Set of tokens up to ID `456` (inclusive): ```fql Token.all({ to: Token.byId("456") }) ``` ``` { data: [ { id: "123", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("111") }, { id: "456", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("222") } ] } ``` # `Token.byDocument()` | Learn: Tokens | | --- | --- | --- | Get the [tokens](../../../../learn/security/tokens/) for an [identity document](../../../../learn/security/tokens/#identity-document). ## [](#signature)Signature ```fql-sig Token.byDocument(document: { *: Any }) => Set ``` ## [](#description)Description Gets the [tokens](../../../../learn/security/tokens/), represented as [`Token` documents](../), for a provided [identity document](../../../../learn/security/tokens/#identity-document). An identity document can have multiple tokens. A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | document | Object | true | Identity document for the tokens to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Token documents for the provided identity document. | ## [](#examples)Examples ```fql Token.byDocument(Customer.byId("111")) ``` ``` { data: [ { id: "371233004820889634", coll: Token, ts: Time("2099-07-25T14:10:32.165Z"), document: Customer("111") }, ... ] } ``` # `Token.byId()` | Learn: Tokens | | --- | --- | --- | Get a [token](../../../../learn/security/tokens/) by its [document `id`](../../../../learn/data-model/documents/#meta). ## [](#signature)Signature ```fql-sig Token.byId(id: ID) => Ref ``` ## [](#description)Description Gets a [token](../../../../learn/security/tokens/), represented as a [`Token` document](../), by its [document `id`](../../../../learn/data-model/documents/#meta).
A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | id | String | true | ID of the Token document to retrieve. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Ref | Resolved reference to the Token document. Can resolve to an existing document or a NullDoc. | ## [](#examples)Examples ```fql Token.byId("371233004820889634") ``` ``` { id: "371233004820889634", coll: Token, ts: Time("2099-07-25T14:10:32.165Z"), document: Customer("111") } ``` # `Token.create()` | Learn: Tokens | | --- | --- | --- | Create a [token](../../../../learn/security/tokens/) without a credential or related password. ## [](#signature)Signature ```fql-sig Token.create(data: { id: ID | Null, document: { *: Any } | Null, ttl: Time | Null, data: { *: Any } | Null }) => Token ``` ## [](#description)Description Creates a [token](../../../../learn/security/tokens/) that’s tied to an [identity document](../../../../learn/security/tokens/#identity-document) without a credential or related password. This method is useful for creating tokens for servers, services, and other non-user identities. A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ### [](#create-token-with-a-credential)Create token with a credential To create a token with a credential and related password, use [`credential.login()`](../../credential/login/) instead. You typically use [`credential.login()`](../../credential/login/) to create and use tokens as part of a Fauna-based [end-user authentication system](../../../../build/tutorials/auth/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Document fields for the new Token document.For supported document fields, see Token collection. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Token | The new Token document. | ## [](#examples)Examples ```fql Token.create({ document: Customer.byId("111") }) ``` ``` { id: "401671202234433613", coll: Token, ts: Time("2099-06-25T13:32:39.240Z"), document: Customer("111"), secret: "fn..." } ``` # `Token.firstWhere()` | Learn: Tokens | | --- | --- | --- | Get the first [token](../../../../learn/security/tokens/) that matches a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Token.firstWhere(pred: (Token => Boolean)) => Token | Null ``` ## [](#description)Description Gets the first [token](../../../../learn/security/tokens/), represented as a [`Token` document](../), that matches a provided [predicate function](../../../fql/functions/#predicates). A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Token document as its only argument. 
Supports shorthand-syntax.Returns a Boolean value.The method returns the first Token document for which the predicate returns true. | ## [](#return-value)Return value One of: | Type | Description | | --- | --- | --- | --- | | Token | First Token document that matches the predicate. | | Null | No Token document matches the predicate. | ## [](#examples)Examples ```fql Token.firstWhere(.document.id == "111") ``` ``` { id: "401670938431586381", coll: Token, ts: Time("2099-06-25T13:28:27.660Z"), document: Customer("111") } ``` # `Token.toString()` | Learn: Tokens | | --- | --- | --- | Get `"Token"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig Token.toString() => String ``` ## [](#description)Description Returns the name of the [`Token` collection](../) as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "Token" | ## [](#examples)Examples ```fql Token.toString() ``` ``` "Token" ``` # `Token.where()` | Learn: Tokens | | --- | --- | --- | Get a Set of [tokens](../../../../learn/security/tokens/) that match a provided [predicate](../../../fql/functions/#predicates). ## [](#signature)Signature ```fql-sig Token.where(pred: (Token => Boolean)) => Set ``` ## [](#description)Description Gets a Set of [tokens](../../../../learn/security/tokens/), represented as [`Token` documents](../), that match a provided [predicate function](../../../fql/functions/#predicates). A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). If `Token.where()` is the last expression in a query, the first page of the `Set` is returned. See [Pagination](../../../../learn/query/pagination/). ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | pred | Predicate function | Yes | Anonymous predicate function that:Accepts a Token document as its only argument. Supports shorthand-syntax.Returns a Boolean value.The method returns a Set of Token documents for which the predicate returns true. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Set | Set of Token documents that match the predicate. If there are no matching documents, the Set is empty. | ## [](#examples)Examples ```fql Token.where(.document.id == "111") ``` ``` { data: [ { id: "401670938431586381", coll: Token, ts: Time("2099-08-14T23:54:00.750Z"), document: Customer("111") }, ... ] } ``` # `token.delete()` | Learn: Tokens | | --- | --- | --- | Delete a [token](../../../../learn/security/tokens/). ## [](#signature)Signature ```fql-sig delete() => NullToken ``` ## [](#description)Description Deletes a [token](../../../../learn/security/tokens/), represented as a [`Token` document](../). A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | NullToken | Document doesn’t exist. See NullDoc.
| ## [](#examples)Examples ```fql Token.byId("401671202234433613")!.delete() ``` ``` Token("401671202234433613") /* deleted */ ``` # `token.exists()` | Learn: Tokens | | --- | --- | --- | Test if a [token](../../../../learn/security/tokens/) exists. ## [](#signature)Signature ```fql-sig exists() => Boolean ``` ## [](#description)Description Tests if a [token](../../../../learn/security/tokens/), represented as a [`Token` document](../), exists. A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ### [](#exists-vs-null-comparisons)`exists()` vs. null comparisons You can use either `exists()` or a null comparison (`== null` or `!= null`) to check the existence or validity of a value. For example: ```fql Token.byId("12345").exists() // true Token.byId("12345") != null // true ``` Key differences: * `exists()` returns an error if called on an unsupported value. * Null comparisons do not throw errors and work safely on any value. For example: ```fql // Declare an object. Objects don't support // an `exists()` method. let object = { a: "Foo", b: "Bar" } object.exists() // Returns `invalid_query` error object != null // Returns true ``` ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Boolean | If true, the Token document exists. If false, the Token document doesn’t exist. | ## [](#examples)Examples ```fql Token.byId("401671202234433613").exists() ``` ``` true ``` # `token.replace()` | Learn: Tokens | | --- | --- | --- | Replace a [token](../../../../learn/security/tokens/). ## [](#signature)Signature ```fql-sig replace(data: { *: Any }) => Token ``` ## [](#description)Description Replaces all fields in a [token](../../../../learn/security/tokens/), represented as a [`Token` document](../), with fields from a provided data object. Fields not present in the data object, excluding the `id`, `coll`, and `ts` metadata fields, are removed. A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. Fauna stores tokens as documents in the [`Token` system collection](../). ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | | Fields for the Token document. Fields not present in the object, excluding the id, coll, and ts metadata fields, are removed.For supported document fields, see Token collection.The object can’t include the following metadata fields: id, coll, ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Token | Token document with replaced fields. | ## [](#examples)Examples ```fql Token.byId("401670938431586381")!.replace({ document: Customer.byId("222") }) ``` ``` { id: "401670938431586381", coll: Token, ts: Time("2099-07-28T03:24:23.810Z"), document: Customer("222") } ``` # `token.update()` | Learn: Tokens | | --- | --- | --- | Update a [token](../../../../learn/security/tokens/).
## [](#signature)Signature ```fql-sig update(data: { *: Any }) => Token ``` ## [](#description)Description Updates a [token](../../../../learn/security/tokens/)'s metadata or [identity document](../../../../learn/security/tokens/#identity-document), represented as a [`Token` document](../), with fields from a provided data object. During the update, fields from the data object are copied to the document, creating new fields or updating existing fields. The operation is similar to a merge. A token is a type of [authentication secret](../../../../learn/security/authentication/#secrets) used to provide identity-based access to a Fauna database. ### [](#nested-fields)Nested fields Fields with nested objects in the data object are merged with the identically named nested object in the document. ### [](#remove-a-field)Remove a field To remove a document field, set its value in the data object to `null`. ### [](#metadata-fields)Metadata fields You can’t use this method to insert or edit the following [metadata fields](../../../../learn/data-model/documents/#meta): * `id` * `coll` * `ts` ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | data | Object | true | Document fields for the Token document.For supported document fields, see Token collection.The object can’t include the following metadata fields:* id * coll * ts | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Token | The updated Token document. | ## [](#examples)Examples ```fql Token.byId("401670938431586381")!.update({ data: { clientIpAddr: "123.123.12.1" } }) ``` ``` { id: "401670938431586381", coll: Token, ts: Time("2099-07-28T03:21:08.580Z"), document: Customer("111"), data: { clientIpAddr: "123.123.12.1" } } ``` # TransactionTime [TransactionTime](../../fql/types/#transactiontime) methods and properties. ## [](#description)Description [TransactionTime](../../fql/types/#transactiontime) methods are provided to represent an instantaneous query transaction time. ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | TransactionTime() | Get the query transaction time. | | TransactionTime.toString() | Get "[transaction time]" as a String. | # `TransactionTime()` Get the query transaction time. ## [](#signature)Signature ```fql-sig TransactionTime() => TransactionTime ``` ## [](#description)Description The [TransactionTime](../../../fql/types/#transactiontime) is a placeholder that is set when the query is committed. If the query doesn’t write, it is equivalent to the query snapshot time, [`Time.now()`](../../time/now/). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | TransactionTime | ISO 8601 time value. | ## [](#examples)Examples This example shows that the document's transaction time, `ts`, equals `TransactionTime()`.
```fql let doc = Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) [doc, doc.ts == TransactionTime()] ``` ``` [ { id: "412735829984673869", coll: Customer, ts: Time("2024-10-25T16:40:10.525Z"), cart: null, orders: "hdWCxmVPcmRlcoHKhGpieUN1c3RvbWVygcZidjD09oHNSgW6VQz0UABNBAD2wYIaZxvJ6hofSt1AEA==", name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }, true ] ``` # `TransactionTime.toString()` Get `"[transaction time]"` as a [String](../../../fql/types/#string). ## [](#signature)Signature ```fql-sig TransactionTime.toString() => String ``` ## [](#description)Description Returns the name of the [TransactionTime](../../../fql/types/#transactiontime) module as a [String](../../../fql/types/#string). ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | String | "[transaction time]" | ## [](#examples)Examples ```fql TransactionTime().toString() ``` ``` "[transaction time]" ``` # Global functions FQL utility functions. ## [](#description)Description The global functions aid application development and debugging and provide added database query functionality. ## [](#static-methods)Static methods | Method | Description | | --- | --- | --- | --- | | abort() | End the current query and return an abort error with a user-defined abort value. | | dbg() | Output a debug message in the query summary and return the message in the query results. | | ID() | Create a valid ID | | log() | Output a log message in the query summary and return null. | | newId() | Get a unique string-encoded 64-bit integer. | # `abort()` End the current query and return an [abort error](#error) with a user-defined `abort` value. ## [](#signature)Signature ```fql-sig abort(return: Any) => Never ``` ## [](#description)Description `abort()` lets you intentionally return an abort error in an FQL query or [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/). You can pass a user-defined return value to `abort()`. For example, in a UDF: ```fsl function checkout(orderId, status, payment) { ... // Abort the query if it calls `checkout()` with a // `status` other than `processing`. if (status != "processing") { // `abort()` accepts a user-defined return value. // The value can be of any FQL type. abort("Cannot call checkout with status other than processing.") } } ``` Calling `abort()` ends the current query, including all expressions in the query, and returns an [abort error](#error). No operations in the query are committed. Changes made before the `abort()` call are discarded. ### [](#error)Abort errors Abort errors have the `abort` error code, a `Query aborted.` error message, and include the user-defined return value in the [Query HTTP API endpoint](../../../http/reference/core-api/#operation/query) response’s `error.abort` response body property: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": "Cannot call checkout with status other than processing." }, ... } ``` The return value is encoded from FQL to JSON using the [data format](../../../http/reference/wire-protocol/#formats) specified in the `X-Format` header.
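For instance, the abort value can be a structured object that describes why the query stopped. The following is a minimal sketch; the field names are illustrative, not part of the API:

```fql
// Abort with a user-defined object.
// The object is returned in the `error.abort` property
// of the abort error response.
abort({
  reason: "out_of_stock",
  productName: "limes"
})
```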
#### [](#query-stack-traces)Query stack traces [Abort errors](../../../http/reference/errors/#abort) include a query stack trace in the response’s [`summary`](../../../http/reference/query-summary/) field. The stack trace includes the lines that triggered the error. For example, the following FQL query includes an [`abort()`](./) call: ```fql Customer.all() abort("Discard") ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request: ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Format: tagged' \ -d '{ "query": "Customer.all()\nabort(\"Discard\")" }' ``` Response: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": "Discard" }, "summary": "error: Query aborted.\nat *query*:2:6\n |\n2 | abort(\"Discard\")\n | ^^^^^^^^^^^\n |", ... } ``` When unescaped, the response’s `summary` renders as: ``` error: Query aborted. at *query*:2:6 | 2 | abort("Discard") | ^^^^^^^^^^^ | ``` #### [](#query-stack-traces-for-udf-calls)Query stack traces for UDF calls When calling a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) that uses [`abort()`](./), the summary’s query stack trace includes both query and UDF lines if the [authentication secret](../../../../learn/security/authentication/#secrets) has the built-in [`admin`](../../../../learn/security/roles/#built-in-roles) or [`server`](../../../../learn/security/roles/#built-in-roles) role. For example, the following response’s `summary` field contains a query stack trace with both query and UDF lines: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": "Can not call checkout with status other than processing." }, "summary": "error: Query aborted.\nat *udf:checkout*:6:10\n |\n6 | abort(\"Can not call checkout with status other than processing.\")\n | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n |\nat *query*:1:9\n |\n1 | checkout(420701723228635213, \"cart\", {})\n | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n |\n\ninfo at *udf:checkout*:3: log - test123\n\ninfo: \"debug - test123\"\nat *udf:checkout*:4:6\n |\n4 | dbg(\"debug - test123\")\n | ^^^^^^^^^^^^^^^^^^^\n |", ... } ``` When unescaped, the response’s `summary` renders as: ``` error: Query aborted. at *udf:checkout*:6:10 | 6 | abort("Can not call checkout with status other than processing.") | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | at *query*:1:9 | 1 | checkout(420701723228635213, "cart", {}) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | info at *udf:checkout*:3: log - test123 info: "debug - test123" at *udf:checkout*:4:6 | 4 | dbg("debug - test123") | ^^^^^^^^^^^^^^^^^^^ | ``` For secrets with the built-in [`server-readonly`](../../../../learn/security/roles/#built-in-roles) role or a [user-defined role](../../../../learn/security/roles/#user-defined-role), the summary’s stack trace only includes query lines, not UDF lines. This applies even if the UDF is annotated with [`@role(admin)`](../../../../learn/schema/user-defined-functions/#runtime-privileges) or [`@role(server)`](../../../../learn/schema/user-defined-functions/#runtime-privileges). This prevents unprivileged users from inferring sensitive UDF logic or data. For example, the previous response’s `summary` field contains a query stack trace with only query lines: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": "Can not call checkout with status other than processing." 
}, "summary": "error: Query aborted.\nat *query*:1:9\n |\n1 | checkout(420701723228635213, \"cart\", {})\n | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n |", ... } ``` When unescaped, the response’s `summary` renders as: ``` error: Query aborted. at *query*:1:9 | 1 | checkout(420701723228635213, "cart", {}) | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | ``` #### [](#abort-errors-in-client-drivers)Abort errors in client drivers Fauna’s client drivers include classes for abort errors: * JavaScript driver: [`AbortError`](https://fauna.github.io/fauna-js/latest/classes/AbortError.html) * Python driver: [`AbortError`](https://fauna.github.io/fauna-python/latest/api/fauna/errors/errors.html#AbortErrores) * Go driver: [`ErrAbort`](https://pkg.go.dev/github.com/fauna/fauna-go/v2#ErrAbort) * .NET/C# driver: [`AbortException`](https://fauna.github.io/fauna-dotnet/latest/class_fauna_1_1_exceptions_1_1_abort_exception.html) * JVM driver: [`AbortException`](https://fauna.github.io/fauna-jvm/latest/com/fauna/exception/AbortException.html) ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | return | Any | true | User-defined value returned in the error.abort property of abort error responses.The return value is encoded from FQL to JSON using the data format specified in the Query HTTP API request's X-Format header. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Never | abort() never returns a value. Queries that call abort() always return an abort error, not query results. | ## [](#examples)Examples ### [](#basic-example)Basic example The following query contains two expressions: * A [`collection.create()`](../../collection/instance-create/) call to create a `Customer` collection document. * An `abort()` call that aborts the query. ```fql Customer.create({ name: "John Doe", email: "jdoe@example.com", address: { street: "87856 Mendota Court", city: "Washington", state: "DC", postalCode: "20220", country: "US" } }) abort("Something went wrong") // Stops the query and returns an error ``` The query returns an [abort error](#error): ``` abort: Query aborted. error: Query aborted. at *query*:12:6 | 12 | abort("Something went wrong") | ^^^^^^^^^^^^^^^^^^^^^^^^ | ``` As a [Query HTTP API endpoint](../../../http/reference/core-api/#operation/query) response using the [simple data format](../../../http/reference/wire-protocol/#simple): ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": "Something went wrong" }, ... } ``` Because the query included an `abort()` call, the [`collection.create()`](../../collection/instance-create/) operation is not committed. The related document isn’t created. To verify the document wasn’t created, you can run use an [index](../../../../learn/data-model/indexes/) to run an [exact match search](../../../../learn/data-model/indexes/#exact-match) for the document: ```fql Customer.byEmail("jdoe@example.com") ``` The query returns an empty Set, indicating the document wasn’t created: ``` { data: [] } ``` ### [](#using-abort-in-udfs)Using `abort()` in UDFs `abort()` is commonly used in [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) to raise errors. You can use `abort()` to intentionally raise an error if the UDF is passed an invalid argument. 
For example, the following FSL [function schema](../../../fsl/function/) defines a UDF that includes several `abort()` calls nested within conditional logic: ```fsl function validateOrderStatusTransition(oldStatus, newStatus) { if (oldStatus == "cart" && newStatus != "processing") { // The order can only transition from cart to processing. abort("Invalid status transition.") } else if (oldStatus == "processing" && newStatus != "shipped") { // The order can only transition from processing to shipped. abort("Invalid status transition.") } else if (oldStatus == "shipped" && newStatus != "delivered") { // The order can only transition from shipped to delivered. abort("Invalid status transition.") } } ``` ### [](#pass-non-string-values-to-abort)Pass non-string values to `abort()` `abort()` accepts a single `return` parameter of [Any](../../../fql/types/#any) type. This value is returned in the [Query HTTP API endpoint](../../../http/reference/core-api/#operation/query) response’s `error.abort` response body property. The return value is encoded from FQL to JSON using the [data format](../../../http/reference/wire-protocol/#formats) specified in the [Query HTTP API request](../../../http/reference/core-api/#operation/query)'s `X-Format` header. #### [](#pass-an-array-value)Pass an Array value The following `abort()` call contains an [Array](../../../fql/types/#array) as a return value: ```fql abort([1, 2, 3]) ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request using the [simple data format](../../../http/reference/wire-protocol/#simple): ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Format: simple' \ -d '{ "query": "abort([1, 2, 3])" }' ``` The API response: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": [ 1, 2, 3 ] }, ... } ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request using the [tagged data format](../../../http/reference/wire-protocol/#tagged): ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Format: tagged' \ -d '{ "query": "abort([1, 2, 3])" }' ``` The API response: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": [ { "@int": "1" }, { "@int": "2" }, { "@int": "3" } ] }, ... } ``` #### [](#pass-a-time-value)Pass a Time value The following `abort()` call contains a [Time](../../../fql/types/#time) as a return value: ```fql abort(Time.now()) ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request using the [simple data format](../../../http/reference/wire-protocol/#simple): ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Format: simple' \ -d '{ "query": "abort(Time.now())" }' ``` The API response: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": "2025-02-21T16:00:00.253411Z" }, ...
} ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request using the [tagged data format](../../../http/reference/wire-protocol/#tagged): ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H "Authorization: Bearer $FAUNA_SECRET" \ -H 'Content-Type: application/json' \ -H 'X-Format: tagged' \ -d '{ "query": "abort(Time.now())" }' ``` The API response: ```json { "error": { "code": "abort", "message": "Query aborted.", "abort": { "@time": "2025-02-21T16:00:33.691141Z" } }, ... } ``` # `dbg()` Output a [debug message](../../../http/reference/query-summary/#debug) in the [query summary](../../../http/reference/query-summary/) and return the message in the query results. ## [](#signature)Signature ```fql-sig dbg(value: A) => A ``` ## [](#description)Description The `dbg()` (debug) method outputs a provided message in the [summary](../../../http/reference/query-summary/) of query responses and returns the message in the query result. In the summary, debug messages are annotated as `info`. ### [](#dbg-and-log)`dbg()` and `log()` [`dbg()`](./) is similar to [`log()`](../log/) except that `dbg()` returns its message in the actual query result. You can use [`dbg()`](./) inline within method calls for debugging. ### [](#debug-message-template)Debug message template The debug message template is: ``` info: <message> at *<source>*:<line>:<column> | <line> | dbg(<value>) | ^^^^^^^^^^^ | ``` where: | Field | Description | | --- | --- | | <source> | Message source. One of: query (the dbg() call occurred in the main query body) or udf:<function> (the dbg() call occurred in the user-defined function named <function>). | | <line> | Line number where dbg() is used. | | <column> | Character offset in <line> where dbg() is used. | | <message> | String-serialized message. | ### [](#debug-messages-for-udf-calls)Debug messages for UDF calls When calling a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) that uses [`dbg()`](./), the visibility of debug messages depends on the [authentication secret](../../../../learn/security/authentication/#secrets)'s role: | Role | Visibility | | --- | --- | --- | --- | | admin, server | Messages are returned in the response’s summary. | | server-readonly, User-defined role | Messages are not returned, even if the UDF is annotated with @role(admin) or @role(server). | ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | value | Any | true | Value to output to the query summary and query results. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Any | Returns value. | ## [](#examples)Examples ### [](#basic-example)Basic example The following FQL query uses [`dbg()`](./) within a [`collection.create()`](../../collection/instance-create/) call: ```fql let x = "key limes" Product.create( // `dbg()` outputs its message. In this case, // it outputs a struct containing the `name` // and `stock` properties.
dbg({ name: "#{x}", stock: 1 + 2 + 3, }) ) ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request: ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H 'Authorization: Bearer ' \ -H 'Content-Type: application/json' \ -H 'X-Format: tagged' \ -d '{ "query": "let x = \"key limes\"\n\nProduct.create(\n dbg({\n name: \"#{x}\",\n stock: 1 + 2 + 3,\n })\n)" }' ``` The message is included in the query results in `data`. The `summary` includes the query lines that called `dbg()`: ``` { "data": { "@doc": { "id": "413921254218661965", "coll": { "@mod": "Product" }, "ts": { "@time": "2099-11-07T18:41:59.173Z" }, "name": "key limes", "stock": { "@int": "6" } } }, "static_type": "Product", "summary": "info: { name: \"key limes\", stock: 6 }\nat *query*:4:6\n |\n4 | dbg({\n | ______^\n5 | | name: \"#{x}\",\n6 | | stock: 1 + 2 + 3,\n7 | | })\n | |____^\n |", ... } ``` When unescaped, the response’s `summary` renders as: ``` info: { name: "key limes", stock: 6 } at *query*:4:6 | 4 | dbg({ | ______^ 5 | | name: "#{x}", 6 | | stock: 1 + 2 + 3, 7 | | }) | |____^ | ``` ### [](#output-a-field-value)Output a field value ```fql Product.create({ name: "debug1", stock: dbg(1 + 2 + 3), }) ``` ``` info: 6 at *query*:3:13 | 3 | stock: dbg(1 + 2 + 3), | ^^^^^^^^^^^ | { id: "394873023799230528", coll: Product, ts: Time("2099-04-11T12:38:31.050Z"), name: "debug1", stock: 6 } ``` ### [](#output-an-object)Output an object ```fql Product.create( dbg({ name: "debug2", stock: 1 + 2 + 3, }) ) ``` ``` info: { name: "debug2", stock: 6 } at *query*:2:6 | 2 | dbg({ | ______^ 3 | | name: "debug2", 4 | | stock: 1 + 2 + 3, 5 | | }) | |____^ | { id: "394873104675897408", coll: Product, ts: Time("2099-04-11T12:39:48.180Z"), name: "debug2", stock: 6 } ``` ### [](#output-a-document)Output a document ```fql dbg( Product.create({ name: "debug3", stock: 1 + 2 + 3, }) ) ``` ``` info: { id: ID("394873211262599234"), coll: Product, ts: TransactionTime(), name: "debug3", stock: 6 } at *query*:1:4 | 1 | dbg( | ____^ 2 | | Product.create({ 3 | | name: "debug3", 4 | | stock: 1 + 2 + 3, 5 | | }) 6 | | ) | |_^ | { id: "394873211262599234", coll: Product, ts: Time("2099-04-11T12:41:29.835Z"), name: "debug3", stock: 6 } ``` ## [](#see-also)See also [`log()`](../log/) [Query summary](../../../http/reference/query-summary/) # `ID()` Create a valid [ID](../../../fql/types/#id) ## [](#signature)Signature ```fql-sig ID(id: String) => ID ID(id: Number) => ID ``` ## [](#description)Description The `ID()` method returns a valid [ID](../../../fql/types/#id) given a [String](../../../fql/types/#string) or [Int](../../../fql/types/#int) representation of the ID. ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | id | String or Int | true | Document identifier. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | ID | Resource identifier. | ## [](#examples)Examples ```fql ID(123) ``` ``` "123" ``` ```fql ID("123") ``` ``` "123" ``` ## [](#see-also)See also [`collection.create()`](../../collection/instance-create/) # `log()` Output a [log message](../../../http/reference/query-summary/#log) in the [query summary](../../../http/reference/query-summary/) and return `null`. ## [](#signature)Signature ```fql-sig log(args: ...Any) => Null ``` ## [](#description)Description The `log()` method outputs `args` containing a message to the [summary](../../../http/reference/query-summary/) of query responses. 
In the summary, log messages are annotated as `info`. [`log()`](./) is similar to `console.log()` or `print()` in other programming languages. ### [](#log-and-dbg)`log()` and `dbg()` [`log()`](./) is similar to [`dbg()`](../dbg/) except that `log()` does not return its message in the query results. ### [](#log-message-template)Log message template The log message template is: ``` info at *<source>*:<line>: <message> ``` where: | Field | Description | | --- | --- | | <source> | Message source. One of: query (the log() call occurred in the main query body) or udf:<function> (the log() call occurred in the user-defined function named <function>). | | <line> | Line number where log() is used. | | <message> | String-serialized message. | ### [](#log-messages-for-udf-calls)Log messages for UDF calls When calling a [user-defined function (UDF)](../../../../learn/schema/user-defined-functions/) that uses [`log()`](./), the visibility of log messages depends on the [authentication secret](../../../../learn/security/authentication/#secrets)'s role: | Role | Visibility | | --- | --- | --- | --- | | admin, server | Messages are returned in the response’s summary. | | server-readonly, User-defined role | Messages are not returned, even if the UDF is annotated with @role(admin) or @role(server). | ## [](#parameters)Parameters | Parameter | Type | Required | Description | | --- | --- | --- | --- | --- | --- | | args | Any | | Values to output to the query summary. Supports interpolated strings containing FQL variables. | ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | Null | After output, the args are discarded. | ## [](#examples)Examples The following FQL query logs several `summary` messages: ```fql log("Before assignment") let x = 5 let y = { lat: 37.5542782, long: -122.3007394 } log("After assignment x=#{x}") log(y) x ``` The query as a [Query endpoint](../../../http/reference/core-api/#operation/query) request: ```bash curl -X POST \ 'https://db.fauna.com/query/1' \ -H 'Authorization: Bearer ' \ -H 'Content-Type: application/json' \ -H 'X-Format: tagged' \ -d '{ "query": "log(\"Before assignment\")\nlet x = 5\nlet y = { lat: 37.5542782, long: -122.3007394 }\nlog(\"After assignment x=#{x}\")\nlog(y)\nx\n" }' ``` Unlike [`dbg()`](../dbg/), [`log()`](./) does not return a value. The message is excluded from the query results in `data`. The `summary` includes the query lines that called `log()`: ``` { "data": { "@int": "5" }, "static_type": "5", "summary": "info at *query*:1: Before assignment\n\ninfo at *query*:4: After assignment x=5\n\ninfo at *query*:5: { lat: 37.5542782, long: -122.3007394 }", ... } ``` When unescaped, the response’s `summary` renders as: ``` info at *query*:1: Before assignment info at *query*:4: After assignment x=5 info at *query*:5: { lat: 37.5542782, long: -122.3007394 } ``` ## [](#see-also)See also [`dbg()`](../dbg/) [Query summary](../../../http/reference/query-summary/) # `newId()` Get a unique string-encoded 64-bit integer. ## [](#signature)Signature ```fql-sig newId() => ID ``` ## [](#description)Description The `newId()` method returns a number that is guaranteed to be unique across all databases and is suitable for constructing the document ID part of a document reference.
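For example, you can generate an ID before creating the document that uses it. The following is a minimal sketch: it assumes a `Product` collection exists and that, as with `Token.create()` above, `create()` accepts an optional `id` field.

```fql
// Generate a document ID up front so it can be shared or
// referenced before the document itself is created.
let id = newId()

// Use the pre-generated ID as the new document's `id`.
Product.create({
  id: id,
  name: "limes",
  stock: 10
})
```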
Document IDs are generated using the [Twitter Snowflake algorithm](https://blog.twitter.com/engineering/en_us/a/2010/announcing-snowflake). The IDs are based on time instead of sequential numbers and are generally increasing. The `newId()` method shouldn’t be used to generate random numbers. ## [](#parameters)Parameters None ## [](#return-value)Return value | Type | Description | | --- | --- | --- | --- | | ID | Numeric value that is unique across databases. | ## [](#examples)Examples ```fql newId() ``` ``` "374494810266927169" ``` ## [](#see-also)See also [`collection.create()`](../../collection/instance-create/) # FSL reference | Learn: Schema | | --- | --- | --- | This guide covers Fauna Schema Language (FSL) syntax and API. For specific FSL schema definitions, see: * [FSL access provider schema](access-provider/) * [FSL collection schema](collection/) * [FSL function schema](function/) * [FSL role schema](role/) ## [](#comments)Comments FSL supports single-line and block comments as described in the FQL [comments](../fql/lexical/#comments) section of the language reference. ## [](#property)Property definition Property definition syntax has one of the following forms: | Syntax | Description | | --- | --- | --- | --- | | | Sets the property value for the item. | | | Relationship between the enclosing schema item and the referenced item, indicating existence. | | { } | Relationship between the enclosing schema item and the referenced item, indicating existence and given a value. | Properties can be unique for a schema item or can be repeated. ## [](#cross-reference)Cross-reference An FSL schema can reference another schema by name. For example, a role schema can reference a collection by name: ```fsl // Schema for the `Customer` collection. collection Customer { ... } ... // Schema for the `customer` role. role customer { // The role references the above `Customer` collection. privileges Customer { read } } ``` # FSL access provider schema | Learn: Access providers | | --- | --- | --- | An FSL access provider schema defines an [access provider](../../../learn/security/access-providers/). An access provider registers an external identity provider (IdP), such as Auth0, in your Fauna database. ```fsl access provider someIssuer { issuer "https://example.com/" jwks_uri "https://example.com/.well-known/jwks.json" role customer } ``` Once [set up](../../../learn/security/access-providers/#config), the IdP can issue JSON Web Tokens (JWTs) that act as Fauna [authentication secrets](../../../learn/security/authentication/#secrets). This lets your application’s end users use the IdP for authentication. You can create and manage schema using any of the following: * The [Fauna CLI](../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../learn/schema/manage-schema/#fql) Fauna stores each access provider schema as an FQL document in the [`AccessProvider`](../../fql-api/accessprovider/) system collection. ## [](#fsl-syntax)FSL syntax ```fsl-sig access provider { issuer "" jwks_uri "" [role [{ predicate }] . . .] } ``` ## [](#name)Name _access provider_ **Required** Unique name for the access provider in the database. Must begin with a letter. Can only include letters, numbers, and underscores. ## [](#properties)Properties | Property | Required | Description | | --- | --- | --- | --- | --- | | issuer | true | Issuer for the IdP’s JWTs. 
Must match the iss claim in JWTs issued by the IdP.The issuer URL. This tells Fauna which IdP is permitted to send a JWT to authorize a query to be executed. | | jwks_uri | true | URI that points to public JSON web key sets (JWKS) for JWTs issued by the IdP. Fauna uses the keys to verify each JWT’s signature. | | role | | User-defined role assigned to JWTs issued by the IdP. Can’t be a built-in role.An access provider can have multiple role properties.Each role property can include a predicate function. If present, JWTs are only assigned the role if the predicate evaluates to true.The predicate function is passed one argument: an object containing the JWT’s payload. The predicate function does not support shorthand syntax. | ## [](#examples)Examples ```fsl access provider someIssuer { issuer "https://example.com/" jwks_uri "https://example.com/.well-known/jwks.json" role customer role manager { predicate (jwt => jwt!.scope.includes("manager")) } } ``` # FSL collection schema | Learn: Schema | | --- | --- | --- | An FSL collection schema defines the structure and behavior of a user-defined [collection](../../../learn/data-model/collections/) and its [documents](../../../learn/data-model/documents/). ```fsl collection Product { // Field definitions. // Define the structure of the collection's documents. name: String? description: String? price: Int = 0 stock: Int = 0 creationTime: Time = Time.now() creationTimeEpoch: Int? typeConflicts: { *: Any }? // Wildcard constraint. // Allows or disallows arbitrary ad hoc fields. *: Any // Migrations block. // Used for schema migrations. // Instructs Fauna how to handle updates to a collection's // field definitions and wildcard constraint. // Contains imperative migration statements. migrations { add .typeConflicts add .stock add_wildcard backfill .stock = 0 drop .internalDesc move_conflicts .typeConflicts move .desc -> .description split .creationTime -> .creationTime, .creationTimeEpoch } // Index definition. // You use indexes to filter and sort documents // in a performant way. index byName { terms [.name] values [desc(.stock), desc(mva(.categories))] } // Unique constraint. // Ensures a field value or combination of field values // is unique for each document in the collection. // Supports multivalue attribute (`mva`) fields, such as Arrays. unique [.name, .description, mva(.categories)] // Check constraint. // Ensures a field value meets provided criteria // before writes. Written as FQL predicate functions. check posStock ((doc) => doc.stock >= 0) // Computed field. // A document field that derives its value from a // user-defined, read-only FQL function that runs on every read. compute InventoryValue: Number = (.stock * .price) // Controls whether you can write to the `ttl` field for collection // documents. If the collection schema doesn't contain field // definitions, `document_ttls` defaults to `true`. Otherwise, // `document_ttls` defaults to `false`. document_ttls true // Sets the default `ttl` for documents in days from their creation // timestamp. You can override the default ttl` during document // creation. ttl_days 5 // Controls document history retention. 
history_days 3 } ``` You can create and manage schema using any of the following: * The [Fauna CLI](../../../learn/schema/manage-schema/#staged) * The [Fauna Dashboard](https://dashboard.fauna.com/) * The Fauna Core HTTP API’s [Schema endpoints](../../http/reference/core-api/#tag/Schema) * [FQL schema methods](../../../learn/schema/manage-schema/#fql) Fauna stores each collection schema as an FQL document in the [`Collection`](../../fql-api/collection/) system collection. ## [](#fsl-syntax)FSL syntax ```fsl-sig [@alias(] collection { [: . . .] [migrations ] [history_days ] [document_ttls ] [ttl_days