GraphQL service - Dataloader
The Ballerina GraphQL module provides the capability to batch and cache data fetching from data sources using the graphql.dataloader submodule. To leverage this functionality in a GraphQL service, you must register data loaders through the graphql:Context object and implement the corresponding prefetch method logic and resolver method logic.
A prefetch method, in the context of Ballerina GraphQL, is a method that is invoked before the actual resolver method. By default, the prefetch method name follows the convention of prefixing the resolver method name with pre (for example, preBooks for the books resolver). Using graphql.dataloader avoids excessive data fetching, effectively addressing the GraphQL N+1 problem.
import ballerina/graphql;
import ballerina/graphql.dataloader;
import ballerina/http;
import ballerina/log;

public type Book record {|
    int id;
    string title;
|};

type BookRow record {|
    readonly int id;
    string title;
    int author;
|};

type AuthorRow record {|
    readonly int id;
    string name;
|};

final readonly & table<AuthorRow> key(id) authorTable = table [
    {id: 1, name: "J.K. Rowling"},
    {id: 2, name: "Stephen King"}
];

final readonly & table<BookRow> key(id) bookTable = table [
    {id: 1, title: "Harry Potter and the Sorcerer's Stone", author: 1},
    {id: 2, title: "The Shining", author: 2},
    {id: 3, title: "Harry Potter and the Chamber of Secrets", author: 1},
    {id: 4, title: "It", author: 2},
    {id: 5, title: "Harry Potter and the Prisoner of Azkaban", author: 1},
    {id: 6, title: "The Stand", author: 2}
];

// Implement the batch load function for the data loader.
// This function handles fetching data in batches. The primary keys of the data to be
// loaded are provided as the values of the `ids` parameter. The expected output of this
// function is an array of results in which each element corresponds to a primary key
// from the `ids` array.
isolated function bookLoaderFunction(readonly & anydata[] ids) returns BookRow[][]|error {
    final int[] keys = check ids.ensureType();
    log:printInfo("executing bookLoaderFunction", keys = keys);
    // Implement the batching logic.
    return keys.'map(isolated function(readonly & int key) returns BookRow[] {
        return bookTable.'filter(book => book.author == key).toArray();
    });
}

@graphql:ServiceConfig {
    contextInit
}
service /graphql on new graphql:Listener(9090) {
    resource function get authors() returns Author[] {
        return from AuthorRow authorRow in authorTable
            select new (authorRow);
    }
}

public isolated distinct service class Author {
    private final readonly & AuthorRow author;

    isolated function init(readonly & AuthorRow author) {
        self.author = author;
    }

    isolated resource function get name() returns string {
        return self.author.name;
    }

    // Define a prefetch method for the `books` resolver.
    isolated function preBooks(graphql:Context ctx) {
        // Obtain the dataloader from the context using its unique name.
        // Providing an invalid name here will lead to a panic.
        dataloader:DataLoader bookLoader = ctx.getDataLoader("bookLoader");
        // Add the primary key of the data to be fetched to the dataloader.
        bookLoader.add(self.author.id);
    }

    isolated resource function get books(graphql:Context ctx) returns Book[]|error {
        // Obtain the dataloader from the context using its unique name.
        dataloader:DataLoader bookLoader = ctx.getDataLoader("bookLoader");
        // Obtain the data from the dataloader using the primary key of the data.
        BookRow[] bookRows = check bookLoader.get(self.author.id);
        return from BookRow bookRow in bookRows
            select {id: bookRow.id, title: bookRow.title};
    }
}

isolated function contextInit(http:RequestContext requestContext, http:Request request) returns graphql:Context {
    graphql:Context ctx = new;
    // Register the dataloader with the context using a unique name.
    // A default implementation of the dataloader is used here.
    ctx.registerDataLoader("bookLoader", new dataloader:DefaultDataLoader(bookLoaderFunction));
    return ctx;
}
Run the service by executing the following command.
$ bal run graphql_dataloader.bal
time = 2023-08-08T13:58:40.479+05:30 level = INFO module = "" message = "executing bookLoaderFunction" keys = [1,2]
Send the following document to the GraphQL endpoint to test the service.
{
authors {
name
books {
title
}
}
}
To send the document, execute the following cURL command in a separate terminal.
$ curl 'http://localhost:9090/graphql' -H 'Content-Type: application/json' -d '{"query": "{ authors { name books { title } } }"}'
{"data":{"authors":[{"name":"J.K. Rowling", "books":[{"title":"Harry Potter and the Sorcerer's Stone"}, {"title":"Harry Potter and the Chamber of Secrets"}, {"title":"Harry Potter and the Prisoner of Azkaban"}]}, {"name":"Stephen King", "books":[{"title":"The Shining"}, {"title":"It"}, {"title":"The Stand"}]}]}}
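Note how the log shows a single bookLoaderFunction execution with keys = [1,2], even though two authors were resolved. The lifecycle behind this can be modeled roughly as follows; this is a simplified Python sketch under assumed semantics (collect keys during prefetch, run one batched fetch on first access, then serve per-key results), not the actual graphql.dataloader implementation.

```python
# Rough, simplified model of a default dataloader (hypothetical class).
class SimpleDataLoader:
    def __init__(self, batch_fn):
        self.batch_fn = batch_fn  # called once with all collected keys
        self.keys = []
        self.results = None

    def add(self, key):
        # Prefetch phase: collect (deduplicated) keys without fetching.
        if key not in self.keys:
            self.keys.append(key)

    def get(self, key):
        # Resolver phase: run one batched fetch for all keys, then serve per key.
        if self.results is None:
            fetched = self.batch_fn(self.keys)
            self.results = dict(zip(self.keys, fetched))
        return self.results[key]

# Usage: two adds, one batched fetch, two per-key reads.
loader = SimpleDataLoader(lambda ids: [[f"book-of-{i}"] for i in ids])
loader.add(1)
loader.add(2)
books_of_author_1 = loader.get(1)
books_of_author_2 = loader.get(2)
```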