# Polkadot Mapping
Mapping functions define how chain data is transformed into the optimised GraphQL entities that we have previously defined in the `schema.graphql` file.

- Mappings are defined in the `src/mappings` directory and are exported as a function.
- These mappings are also exported in `src/index.ts`.
- The mapping files are referenced in `project.ts` under the mapping handlers, as shown in the sketch below.
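For orientation, here is a minimal, abridged sketch of how a handler might be referenced in `project.ts`; the handler name `handleEvent` and the `balances`/`Deposit` filter are illustrative assumptions, not part of a specific example project:

```ts
import {
  SubstrateDatasourceKind,
  SubstrateHandlerKind,
  SubstrateProject,
} from "@subql/types";

// Abridged manifest: network, schema, and other required fields are omitted
const project = {
  dataSources: [
    {
      kind: SubstrateDatasourceKind.Runtime,
      startBlock: 1,
      mapping: {
        file: "./dist/index.js", // compiled mappings, re-exported from src/index.ts
        handlers: [
          {
            kind: SubstrateHandlerKind.Event,
            // must match a function exported from src/mappings (via src/index.ts)
            handler: "handleEvent",
            filter: { module: "balances", method: "Deposit" },
          },
        ],
      },
    },
  ],
} as Partial<SubstrateProject>;

export default project;
```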
There are three classes of mapping functions for Polkadot/Substrate: Block Handlers, Event Handlers, and Call Handlers.
## Block Handler
You can use block handlers to capture information each time a new block is attached to the chain, e.g. the block number. To achieve this, a defined BlockHandler will be called once for every block.

Using block handlers slows your project down as they can be executed with each and every block - only use them if you really need to.
```ts
import { SubstrateBlock } from "@subql/types";
// BlockEntity is generated by `subql codegen` from your schema.graphql
import { BlockEntity } from "../types";

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // Create a new BlockEntity with the block hash as its ID
  const record = new BlockEntity(block.block.header.hash.toString());
  record.field1 = block.block.header.number.toNumber();
  await record.save();
}
```
A `SubstrateBlock` is an extended interface type of signedBlock that also includes the `specVersion` and `timestamp`.
## Event Handler
You can use event handlers to capture information when certain events are included in a new block. Events are part of the default Substrate runtime, and a block may contain multiple events.

During processing, the event handler will receive an event as an argument with the event's typed inputs and outputs. Any type of event will trigger the mapping, allowing activity with the data source to be captured. You should use Mapping Filters in your manifest to filter events, to reduce the time it takes to index data and improve mapping performance.
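For example, an event mapping filter in a YAML manifest might look like the following sketch (the `balances` module and `Deposit` method are illustrative assumptions):

```yaml
handlers:
  - handler: handleEvent
    kind: substrate/EventHandler
    filter:
      module: balances
      method: Deposit
```

The handler itself then receives the matching event: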
```ts
import { SubstrateEvent } from "@subql/types";
import { Balance } from "@polkadot/types/interfaces";
// EventEntity is generated by `subql codegen` from your schema.graphql
import { EventEntity } from "../types";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  const {
    event: {
      data: [account, balance],
    },
  } = event;
  const record = new EventEntity(
    event.extrinsic.block.block.header.hash.toString(),
  );
  record.field2 = account.toString();
  record.field3 = (balance as Balance).toBigInt();
  await record.save();
}
```
A `SubstrateEvent` is an extended interface type of the EventRecord. Besides the event data, it also includes an `id` (the block to which this event belongs) and the extrinsic inside of this block.
**Note:** From `@subql/types` version X.X.X onwards, `SubstrateEvent` is generic. This can provide you with higher type safety when developing your project.
```ts
// EvmLog comes from @polkadot/types/interfaces on chains with the EVM pallet
import { EvmLog } from "@polkadot/types/interfaces";
import { SubstrateEvent } from "@subql/types";

async function handleEvmLog(event: SubstrateEvent<[EvmLog]>): Promise<void> {
  // `eventData` will be of type `EvmLog`; before, it would have been `Codec`
  const [eventData] = event.event.data;
}
```
## Call Handler
Call handlers are used when you want to capture information on certain Substrate extrinsics. You should use Mapping Filters in your manifest to filter calls, to reduce the time it takes to index data and improve mapping performance.
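As with events, a call filter in the YAML manifest might look like this sketch (the `balances`/`transfer` values and the `success` flag are illustrative assumptions):

```yaml
handlers:
  - handler: handleCall
    kind: substrate/CallHandler
    filter:
      module: balances
      method: transfer
      success: true
```

The handler then receives the matching extrinsic: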
```ts
import { SubstrateExtrinsic } from "@subql/types";
// CallEntity is generated by `subql codegen` from your schema.graphql
import { CallEntity } from "../types";

export async function handleCall(extrinsic: SubstrateExtrinsic): Promise<void> {
  const record = new CallEntity(extrinsic.block.block.header.hash.toString());
  record.field4 = extrinsic.block.timestamp;
  await record.save();
}
```
A `SubstrateExtrinsic` extends GenericExtrinsic. It is assigned an `id` (the block to which this extrinsic belongs) and provides an `events` property containing the events within this block. Additionally, it records the success status of this extrinsic.
**Note:** From `@subql/types` version X.X.X onwards, `SubstrateExtrinsic` is generic. This can provide you with higher type safety when developing your project.
```ts
// TransactionV2 and EthTransaction come from @polkadot/types/interfaces on EVM-compatible chains
import { TransactionV2, EthTransaction } from "@polkadot/types/interfaces";
import { SubstrateExtrinsic } from "@subql/types";

async function handleEvmCall(
  call: SubstrateExtrinsic<[TransactionV2 | EthTransaction]>,
): Promise<void> {
  // `tx` will be of type `TransactionV2 | EthTransaction`; before, it would have been `Codec`
  const [tx] = call.extrinsic.method.args;
}
```
## Third-party Library Support - the Sandbox
SubQuery is deterministic by design, which means that each SubQuery project is guaranteed to index the same data set. This is a critical factor that makes it possible to verify data in the decentralised SubQuery Network. This limitation means that, in the default configuration, the indexer runs in a strict virtual machine with access to only a limited set of third-party libraries.

You can easily bypass this limitation, however, allowing you to retrieve data from external API endpoints, make non-historical RPC calls, and import your own external libraries into your projects. In order to do so, you must run your project in `unsafe` mode; you can read more about this in the references. An easy way to do this while developing (and running in Docker) is to add the following line to your `docker-compose.yml`:
```yaml
subquery-node:
  image: onfinality/subql-node:latest
  ...
  command:
    - -f=/app
    - --db-schema=app
    - --unsafe
  ...
```
When run in `unsafe` mode, you can import any custom libraries into your project and make external API calls using tools like `node-fetch`. A simple example is given below:
```ts
import { SubstrateEvent } from "@subql/types";
import fetch from "node-fetch";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  const httpData = await fetch("https://api.github.com/users/github");
  // Parse the response body as JSON before logging it
  logger.info(`httpData: ${JSON.stringify(await httpData.json())}`);
  // Do something with this data
}
```
By default (when in safe mode), the VM2 sandbox only allows the following:

- only certain built-in modules, e.g. `assert`, `buffer`, `crypto`, `util`, and `path`
- third-party libraries written in CommonJS
- hybrid libraries like `@polkadot/*` that use ESM as the default. However, if any other libraries depend on any modules in ESM format, the virtual machine will NOT compile and will return an error.
- historical/safe queries (see RPC calls)
- external `HTTP` and `WebSocket` connections are forbidden
## Modules and Libraries
To improve SubQuery's data processing capabilities, we have allowed some of NodeJS's built-in modules for running mapping functions in the sandbox, and have allowed users to call third-party libraries.

Please note this is an experimental feature and you may encounter bugs or issues that may negatively impact your mapping functions. Please report any bugs you find by creating an issue on GitHub.
### Built-in modules

Currently, we allow the following NodeJS modules: `assert`, `buffer`, `crypto`, `util`, and `path`.
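As an illustrative sketch, a mapping function could use the allowed `crypto` module like this (the `hashPayload` helper is hypothetical):

```ts
// crypto is one of the built-in modules permitted in the sandbox
import { createHash } from "crypto";

export function hashPayload(payload: string): string {
  // Compute a hex-encoded SHA-256 digest of the payload
  return createHash("sha256").update(payload).digest("hex");
}
```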
Rather than importing the whole module, we recommend only importing the required method(s) that you need. Some methods in these modules may have dependencies that are unsupported and will fail on import.
```ts
import { hashMessage } from "ethers/lib/utils"; // Good way
import { utils } from "ethers"; // Bad way
```
## Query States
Our goal is to cover all data sources for users in mapping handlers (more than just the three handler interfaces above). Therefore, we have exposed some of the @polkadot/api interfaces to increase capabilities.

These are the interfaces we currently support:

- `api.query.<module>.<method>()` will query the current block.
- `api.query.<module>.<method>.multi()` will make multiple queries of the same type at the current block.
- `api.queryMulti()` will make multiple queries of different types at the current block.

These are the interfaces we do NOT support currently:

- `api.tx.*`
- `api.derive.*`
- `api.query.<module>.<method>.at`
- `api.query.<module>.<method>.entriesAt`
- `api.query.<module>.<method>.entriesPaged`
- `api.query.<module>.<method>.hash`
- `api.query.<module>.<method>.keysAt`
- `api.query.<module>.<method>.keysPaged`
- `api.query.<module>.<method>.range`
- `api.query.<module>.<method>.sizeAt`

See an example of using this API in our validator-threshold example use case.
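As a minimal sketch of the supported interfaces (the `ALICE` and `BOB` addresses are hypothetical placeholders, and `api` is the global instance injected into the sandbox):

```ts
import { SubstrateEvent } from "@subql/types";

// Hypothetical development addresses, used purely for illustration
const ALICE = "5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY";
const BOB = "5FHneW46xGXgs5mUiveU4sbTyGBzmstUspZC92UhjJM694ty";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  // Query a single storage value at the current indexing block
  const issuance = await api.query.balances.totalIssuance();
  // Make multiple queries of the same type at the current indexing block
  const [alice, bob] = await api.query.system.account.multi([ALICE, BOB]);
  logger.info(`Total issuance: ${issuance.toString()}`);
  logger.info(`Nonces - Alice: ${alice.nonce.toString()}, Bob: ${bob.nonce.toString()}`);
}
```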
## RPC calls
We also support some API RPC methods, which are remote calls that allow the mapping function to interact with the actual node for queries and submissions.

The JSON-RPC documentation lists some methods that take `BlockHash` as an input parameter (e.g. `at?: BlockHash`); these are now permitted. We have also modified these methods to take the current indexing block hash by default.
```ts
// Let's say we are currently indexing a block with this hash
const blockhash = `0x844047c4cf1719ba6d54891e92c071a41e3dfe789d064871148e9d41ef086f6a`;

// The original method takes an optional block hash as input
const b1 = await api.rpc.chain.getBlock(blockhash);

// It will use the current block hash by default, like so
const b2 = await api.rpc.chain.getBlock();
```
- For Custom Substrate Chains RPC calls, see usage.
## Custom Substrate Chains
SubQuery can be used on any Substrate-based chain, not just Polkadot or Kusama.
You can use a custom Substrate-based chain and we provide tools to import types, interfaces, and additional methods automatically using @polkadot/typegen.
In the following sections, we use our kitty example to explain the integration process.
### Preparation
Create a new directory `api-interfaces` under the project `src` folder to store all required and generated files. We also create an `api-interfaces/kitties` directory, as we want to add decoration in the API from the `kitties` module.
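A sketch of creating these directories from the project root:

```shell
mkdir -p src/api-interfaces/kitties
```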
### Metadata
We need metadata to generate the actual API endpoints. In the kitty example, we use an endpoint from a local testnet, and it provides additional types. Follow the steps in PolkadotJS metadata setup to retrieve a node's metadata from its HTTP endpoint:

```shell
curl -H "Content-Type: application/json" -d '{"id":"1", "jsonrpc":"2.0", "method": "state_getMetadata", "params":[]}' http://localhost:9933
```

or from its websocket endpoint with help from `websocat`:

```shell
# Install websocat
brew install websocat

# Get metadata
echo state_getMetadata | websocat 'ws://127.0.0.1:9944' --jsonrpc
```

Next, copy and paste the output into a JSON file. In our kitty example, we have created `api-interfaces/kitty.json`.
### Type definitions
We assume that the user knows the specific types and RPC support from the chain, and that these are defined in the Manifest.

Following the types setup, we create:

`src/api-interfaces/definitions.ts` - this exports all the sub-folder definitions:

```ts
export { default as kitties } from "./kitties/definitions";
```

`src/api-interfaces/kitties/definitions.ts` - type definitions for the kitties module:
```ts
export default {
  // custom types
  types: {
    Address: "AccountId",
    LookupSource: "AccountId",
    KittyIndex: "u32",
    Kitty: "[u8; 16]",
  },
  // custom rpc : api.rpc.kitties.getKittyPrice
  rpc: {
    getKittyPrice: {
      description: "Get Kitty price",
      params: [
        {
          name: "at",
          type: "BlockHash",
          isHistoric: true,
          isOptional: false,
        },
        {
          name: "kittyIndex",
          type: "KittyIndex",
          isOptional: false,
        },
      ],
      type: "Balance",
    },
  },
};
```
### Packages
- In the `package.json` file, make sure to add `@polkadot/typegen` as a development dependency and `@polkadot/api` as a regular dependency (ideally the same version). We also need `ts-node` as a development dependency to help us run the scripts.
- We add scripts to run both generators: `generate:defs` for types and `generate:meta` for metadata (in that order, so the metadata can use the types).

Here is a simplified version of `package.json`. Make sure in the scripts section the package name is correct and the directories are valid:
```json
{
  "name": "kitty-birthinfo",
  "scripts": {
    "generate:defs": "ts-node --skip-project node_modules/.bin/polkadot-types-from-defs --package kitty-birthinfo/api-interfaces --input ./src/api-interfaces",
    "generate:meta": "ts-node --skip-project node_modules/.bin/polkadot-types-from-chain --package kitty-birthinfo/api-interfaces --endpoint ./src/api-interfaces/kitty.json --output ./src/api-interfaces --strict"
  },
  "dependencies": {
    "@polkadot/api": "^4.9.2"
  },
  "devDependencies": {
    "typescript": "^4.1.3",
    "@polkadot/typegen": "^4.9.2",
    "ts-node": "^8.6.2"
  }
}
```
### Type generation
Now that preparation is complete, we are ready to generate types and metadata. Run the commands below:

```shell
# Yarn to install new dependencies
yarn

# Generate types
yarn generate:defs
```

In each module's folder (e.g. `/kitties`), there should now be a generated `types.ts` that defines all interfaces from this module's definitions, and an `index.ts` file that exports them all.

```shell
# Generate metadata
yarn generate:meta
```

This command will generate the metadata and a new api-augment for the APIs. As we don't want to use the built-in API, we will need to replace them by adding an explicit override in our `tsconfig.json`. After the updates, the paths in the config will look like this (without the comments):
```jsonc
{
  "compilerOptions": {
    "paths": {
      // this is the package name we use (in the interface imports, --package for generators)
      "kitty-birthinfo/*": ["src/*"],
      // here we replace the @polkadot/api augmentation with our own, generated from chain
      "@polkadot/api/augment": ["src/api-interfaces/augment-api.ts"],
      // replace the augmented types with our own, as generated from definitions
      "@polkadot/types/augment": ["src/api-interfaces/augment-types.ts"]
    }
  }
}
```
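With these path overrides in place, the generated interfaces can be imported via the package name, for example (a sketch; the exact module layout depends on your generated output):

```ts
// Kitty and KittyIndex are the interfaces generated from the kitties definitions above
import { Kitty, KittyIndex } from "kitty-birthinfo/api-interfaces";
```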
### Usage
Now in the mapping function, we can show how the metadata and types actually decorate the API. The RPC endpoint will support the modules and methods we declared above. To use custom RPC calls, please see the section Custom Chain RPC calls.
```ts
export async function kittyApiHandler(): Promise<void> {
  // return the KittyIndex type
  const nextKittyId = await api.query.kitties.nextKittyId();
  // return the Kitty type, input parameter types are AccountId and KittyIndex
  const allKitties = await api.query.kitties.kitties("xxxxxxxxx", 123);
  logger.info(`Next kitty id ${nextKittyId}`);
  // custom rpc, pass `undefined` as the block hash
  const kittyPrice = await api.rpc.kitties.getKittyPrice(
    undefined,
    nextKittyId,
  );
}
```
If you wish to publish this project to our explorer, please include the generated files in `src/api-interfaces`.
### Custom Chain RPC calls
To support customised chain RPC calls, we must manually inject RPC definitions for `typesBundle`, allowing per-spec configuration. You can define the `typesBundle` in the `project.yml`. Please remember that only `isHistoric` types of calls are supported.

```yaml
...
types: {
  "KittyIndex": "u32",
  "Kitty": "[u8; 16]",
}
typesBundle: {
  spec: {
    chainname: {
      rpc: {
        kitties: {
          getKittyPrice: {
            description: string,
            params: [
              {
                name: 'at',
                type: 'BlockHash',
                isHistoric: true,
                isOptional: false
              },
              {
                name: 'kittyIndex',
                type: 'KittyIndex',
                isOptional: false
              }
            ],
            type: "Balance",
          }
        }
      }
    }
  }
}
```