This is the full developer documentation for Aptos.
# Build the Future of Web3 on Aptos
> Everything you need to build a best-in-class Web3 experience.
## Features
* [Keyless](/build/guides/aptos-keyless)
* [Passkeys](https://github.com/aptos-foundation/AIPs/blob/main/aips/aip-66.md)
* [On-chain Randomness](/build/smart-contracts/randomness)
* [Gas and Storage Fees](/network/blockchain/gas-txn-fee)
* [Parallel Execution](/network/blockchain/execution)
* [Fee Payer](/build/sdks/ts-sdk/building-transactions/sponsoring-transactions)
## Tooling
* [Aptos CLI](/build/cli)
* [Indexer API](/build/indexer/indexer-api)
* [Official SDKs](/build/sdks)
* [Testnet Faucet](/network/faucet)
* [Faucet API](/build/apis/faucet-api)
## Resources
* [Apply for a Grant](https://aptosfoundation.org/grants)
* [Aptos Learn](https://learn.aptoslabs.com)
* [Ecosystem Projects](https://aptosfoundation.org/ecosystem/projects)
## Connect
* [Developer Discussions](https://github.com/aptos-labs/aptos-developer-discussions/discussions)
* [Discussion Forum](https://forum.aptosfoundation.org)
* [Discord](https://discord.gg/aptosnetwork)
* [Telegram](https://t.me/aptos)
# Aptos Model Context Protocol (MCP)
The Aptos Model Context Protocol (MCP) is a server that provides a set of tools, prompts, and resources to help developers build applications on the Aptos blockchain. It is designed to be used with AI tools like Cursor, Claude Code, and others that support the Model Context Protocol.
## Getting Started
We’ve provided guides for Cursor and Claude Code to help you integrate the Aptos MCP into your development environment. If you’re using a different AI tool, follow its own MCP setup steps and refer to the Cursor or Claude Code documentation for examples.
* [Claude Code](aptos-mcp/claude): Set up for Claude Code
* [Cursor](aptos-mcp/cursor): Set up for Cursor
# Aptos MCP & Claude Code
## Set up Claude Code with Aptos MCP
1. Install the `claude-code` package
```bash
npm install -g @anthropic-ai/claude-code
```
2. Locate where Claude Code stores its configuration. On macOS it is usually at `~/.claude.json`
3. Edit the `mcpServers` object in the JSON file to include:
```json
{
  "mcpServers": {
    "aptos-mcp": {
      "command": "npx",
      "args": ["-y", "@aptos-labs/aptos-mcp"],
      "type": "stdio",
      "env": {
        "APTOS_BOT_KEY": ""
      }
    }
  }
}
```
4. Obtain your `APTOS_BOT_KEY`:
* Visit the [Aptos Developer Portal](https://aptos.dev) and log in with your account.
* Navigate to the API Keys section and create a new key.
* Copy the generated key for use in the next step.
5. Update the `APTOS_BOT_KEY` value in the configuration file with the key you generated in the previous step.
6. Navigate to your project
```bash
cd your-awesome-project
```
7. In a new terminal window type:
```bash
claude
```
8. You can now use Claude Code to interact with the Aptos MCP. Prompt the agent with `what aptos mcp version are you using?` to verify the connection. The agent should reply with something like:
```text
I'm using Aptos MCP version 0.0.2.
```
# Aptos MCP & Cursor
## Set up Cursor with Aptos MCP
1. Open the Cursor IDE
2. On the project root folder, create a `.cursor` folder
3. In the `.cursor` folder, create a `mcp.json` file
4. Paste this content
```json
{
  "mcpServers": {
    "aptos-mcp": {
      "command": "npx",
      "args": ["-y", "@aptos-labs/aptos-mcp"],
      "env": {
        "APTOS_BOT_KEY": ""
      }
    }
  }
}
```
5. Obtain your `APTOS_BOT_KEY`:
* Visit the [Aptos Developer Portal](https://aptos.dev) and log in with your account.
* Navigate to the API Keys section and generate a new key.
* Copy the generated key for use in the next step.
6. Make sure to update the `APTOS_BOT_KEY` in the `mcp.json` file with the key you just generated.
### Verify Cursor runs your MCP
1. Open Cursor Settings: `cursor -> settings -> cursor settings`
2. Head to the `MCP` or `Tools & Integrations` section
3. Make sure it is enabled and showing a green color indicator

4. Click the “refresh” icon to update the MCP.
5. Make sure the Cursor AI window dropdown is set to `Agent`

6. Prompt the agent with `what aptos mcp version are you using?` to verify the connection. The agent should reply with something like `I'm using Aptos MCP version 0.0.2.`
# Aptos Improvement Proposals (AIPs)
Aptos Improvement Proposals (AIPs) are a way for the Aptos community to propose changes, improvements, and new features to the Aptos protocol. AIPs are designed to be a collaborative process that allows anyone in the community to contribute ideas and feedback.
AIPs are documented in the [AIPs repository](https://github.com/aptos-foundation/AIPs) and are administered by the Aptos Foundation. Each AIP is assigned a unique number and goes through a rigorous review process before it is accepted or rejected.
## What do AIPs cover?
AIPs can cover a wide range of topics, including:
* Node protocol changes - Mempool changes, consensus changes, etc.
* Framework (smart contract) changes - New modules, new functions, etc.
* Governance changes - Changes to the way the Aptos Foundation operates, changes to the way AIPs are processed, etc.
## What is this section of the docs mostly about?
This section of the docs is mostly about AIPs that are relevant to developers and providing FAQs and quick information about them.
# AIP-115 - Stateless Accounts
[AIP-115](https://github.com/aptos-foundation/AIPs/blob/main/aips/aip-115.md) covers stateless accounts.
## General FAQ
### What is a Stateless Account?
A Stateless Account is a new behavior for Aptos accounts that allows them to operate without requiring an explicitly created `0x1::account::Account` resource. Instead, these accounts use default behaviors until an action necessitates the creation of the resource. This change simplifies account management and reduces unnecessary resource creation, making it easier for developers and users to interact with the Aptos blockchain.
### How is it different from a regular account?
Technically, there is no separate account type. All accounts are the same under the hood. The difference is that accounts without a resource behave in a “stateless” manner using default values. The account resource is only created on-demand when needed.
### How does it work?
When an account signs and submits its first sequence-number-based transaction, the `0x1::account::Account` resource is not created upfront. It is created only when an action requires incrementing the sequence number.
For an orderless transaction, the account resource is not needed at all, so it will not be created.
## Technical Details FAQ
### What is the default auth\_key for Stateless Accounts?
If the `0x1::account::Account` resource does not exist, the auth\_key defaults to the account address itself. This allows the account to sign and submit transactions without needing a resource.
### What is the sequence number of a Stateless Account?
It defaults to `0` if the account resource does not exist. In the future, with Orderless Transactions, the sequence number may be eliminated entirely.
### When is the account resource automatically created?
The resource is created when an action requires on-chain state, such as:
* Rotating the authentication key
* Using capabilities or features that rely on the account resource, such as the sequence number
* Explicitly calling functions that access fields in the account resource
### Does creating the account resource incur extra gas cost?
Yes. The creation of the resource is deferred, and the corresponding gas and storage fees are only charged at the moment of actual creation, not beforehand.
### Any behavior change to account module at the Move level?
`0x1::account::exists_at` always returns true, as all on-chain account addresses are considered valid and treated as existing by default. There is no Move function in the module to check whether the underlying account resource really exists, since the goal is to make it transparent to users. As a result, any logic that first checks whether an account exists before attempting to create it is now obsolete.
### Can users force-create the account resource upfront?
Yes. Users can explicitly call functions like `0x1::account::create_account_if_does_not_exist` to create the resource manually, if desired.
### Any behavior change to API?
`GET /accounts/{address}` will never return “404 not found”; instead it returns the default authentication key and sequence number described above for stateless accounts. If you rely on the previous behavior, please adjust accordingly. To check whether the account resource exists, query `GET /accounts/{address}/resource/0x1::account::Account` instead.
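The resource-existence check described above can be sketched as a direct REST call. This is only a sketch: the fullnode URL is the mainnet one mentioned elsewhere in these docs, and `fetchFn` is injectable so the logic can be exercised without network access.

```typescript
// Sketch: check whether an address has the 0x1::account::Account resource.
// GET /accounts/{address} never 404s for stateless accounts, so query the
// resource endpoint directly instead.
type FetchLike = (url: string) => Promise<{ status: number }>;

const NODE = "https://fullnode.mainnet.aptoslabs.com/v1";

async function hasAccountResource(
  address: string,
  fetchFn: FetchLike = fetch,
): Promise<boolean> {
  const res = await fetchFn(
    `${NODE}/accounts/${address}/resource/0x1::account::Account`,
  );
  // 404 here means the resource has not been created yet (stateless account)
  return res.status !== 404;
}
```

Remember that, as noted above, a missing resource does not mean the account is invalid or has had no activity.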
### Do existing accounts get affected?
No. Existing accounts with resources already created will continue to work exactly as they do now. Stateless Account behavior only applies to accounts that have not yet created a resource.
### Do dApps / CEX need to change anything?
Maybe. Previously, checking whether an account existed often relied on calling APIs that return a 404 error if the account resource was not found. Applications would then use this as a signal to warn users (e.g., “This account does not exist”). Under the new model, all addresses are considered valid, and such 404-based existence checks are no longer reliable or meaningful. However, we are not banning this pattern—developers may still choose to warn users that an account may not have performed any on-chain activity and thus might not have a resource created yet.
If you still want to detect whether an account has an associated resource, you can refer to the method described in Q9 or check whether the sequence\_number is 0. But be aware that with the introduction of orderless transactions, some accounts may only submit transactions that never create a resource, which could result in false negatives.
We recommend designing your application to be robust regardless of whether the account resource exists, and to avoid assuming resource presence as a proxy for account existence.
Examples:
* A wallet might check for an account to see if it’s a new account, and provide a user a warning. With this change, instead a mitigation like Q9 will be needed.
* A custodial wallet may send funds to initialize an account with gas. With this change, it will need to check the account’s balance instead of just the account existing.
### Is this compatible with Orderless Transactions?
Yes. Orderless Transactions and Stateless Accounts are complementary. Once Orderless Transactions are enabled, sequence numbers will no longer be needed, enabling truly stateless usage.
## Will all accounts become Stateless in the future?
No. Stateless Accounts are not a new account type; accounts simply behave with default logic until the account resource is needed. This lazy resource creation does not transform existing account state. All accounts can behave in a stateless way by default, but they will still create the standard resource if and when advanced features are used.
# AIP-88 - Block Epilogue Transactions
[AIP-88](https://github.com/aptos-foundation/AIPs/blob/main/aips/aip-88.md) covers block epilogue transactions, which are a new type of transaction that give information about the block after it has been executed. These transactions can only be created by the consensus and are not user-initiated. They contain information about gas usage in the block and will contain more information in the future.
It replaces the previous `StateCheckpoint` transaction type, which only sometimes signaled the end of a block. The `BlockEpilogue` transaction is now created at the end of a block instead, and it is guaranteed to be the last transaction in the block. The only exception is the last block of an epoch, which has no `BlockEpilogue` transaction.
## General FAQ
### What is in the Block Epilogue Transaction?
The block epilogue transaction contains a `BlockEndInfo` enum. It is purposely designed to be an enum so that it can be extended in the future without breaking existing code. The current version is `V0` and contains the following fields:
```move
module 0x1::epilogue {
    enum BlockEndInfo {
        V0 {
            /// Whether block gas limit was reached
            block_gas_limit_reached: bool,
            /// Whether block output limit was reached
            block_output_limit_reached: bool,
            /// Total gas_units block consumed
            block_effective_block_gas_units: u64,
            /// Total output size block produced
            block_approx_output_size: u64,
        },
    }
}
```
These mainly contain information about the gas usage in the block for debugging purposes.
The JSON output will look like this:
```json
{
"version":"1912",
"hash":"0x54a8efc93fc94f5b545dadb63da3d4dc192125c717b336dc446d55a5b553913f",
"state_change_hash":"0xafb6e14fe47d850fd0a7395bcfb997ffacf4715e0f895cc162c218e4a7564bc6",
"event_root_hash":"0x414343554d554c41544f525f504c414345484f4c4445525f4841534800000000",
"state_checkpoint_hash":"0x841a43956ca09a02b1c1cdadc65f24c390170aa666015a2e8f7ec5c9d6a3875f",
"gas_used":"0",
"success":true,
"vm_status":"Executed successfully",
"accumulator_root_hash":"0x6561976b4560ff25239dffc6cada70e7008dd42fc4d3df2eca6a86b6d2ec384d",
"changes":[],
"timestamp":"1719263322836578",
"block_end_info": {
"block_gas_limit_reached":false,
"block_output_limit_reached":false,
"block_effective_block_gas_units":0,
"block_approx_output_size":1163
},
"type":"block_epilogue_transaction"
}
```
## Compatibility FAQ
### What does this mean for my dApp?
If you process transactions in your dApp, and expect the last transaction in a block to be a `StateCheckpoint`, you will need to update your code to handle the `BlockEpilogue` transaction instead.
Note that the `BlockEpilogue` transaction is guaranteed to be the last transaction of a block, except for the last block of an epoch, which will not have a `BlockEpilogue` transaction.
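For illustration, branching on the last transaction of a block might look like this sketch. The `block_epilogue_transaction` type string and the `block_end_info` field follow the JSON example above; the legacy `state_checkpoint_transaction` type name is an assumption about the API's naming.

```typescript
// Sketch: handle the epilogue transaction when processing a block's transactions.
interface Txn {
  type: string;
  block_end_info?: { block_approx_output_size: number };
}

function describeLastTxn(txns: Txn[]): string {
  const last = txns[txns.length - 1];
  switch (last.type) {
    case "block_epilogue_transaction":
      // New end-of-block transaction carrying BlockEndInfo
      return `epilogue, output ~${last.block_end_info?.block_approx_output_size} bytes`;
    case "state_checkpoint_transaction":
      // Legacy end-of-block marker (assumed type name)
      return "state checkpoint";
    default:
      return last.type;
  }
}
```

Code that assumed the last transaction is always a `StateCheckpoint` would fall through to the default branch here instead of failing.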
### What apps are likely to be affected?
Apps that index all transactions such as block explorers and centralized exchange indexer processors may be affected. However, most of these are informational and do not affect the core functionality of the dApp.
### What can I do to process the new transaction type?
If you’re using the Aptos Go SDK or the Aptos TypeScript SDK, you can update to the latest version, which will automatically handle the new transaction type.
# Aptos APIs
The Aptos Blockchain network can be accessed by several APIs, depending on your use case.
## Aptos Fullnode
This API - embedded into Fullnodes - provides a simple, low latency, yet low-level way of *reading* state and *submitting* transactions to the Aptos Blockchain. It also supports transaction simulation.
* [Aptos Fullnode REST API (Mainnet)](/build/apis/fullnode-rest-api-reference?network=mainnet): Mainnet API playground for the Aptos Fullnode REST API
* [Aptos Fullnode REST API (Testnet)](/build/apis/fullnode-rest-api-reference?network=testnet): Testnet API playground for the Aptos Fullnode REST API
* [Aptos Fullnode REST API (Devnet)](/build/apis/fullnode-rest-api-reference?network=devnet): Devnet API playground for the Aptos Fullnode REST API
## Indexer
* [Indexer GraphQL API](/build/indexer): This GraphQL API offers a high-level, opinionated GraphQL interface to read state from the Aptos Blockchain. It's ideal for interacting with NFTs, Aptos Objects, or custom Move contracts.
* [Transaction Stream API](/build/indexer/txn-stream): This GRPC API streams historical and real-time transaction data to an indexing processor. It's used by Aptos Core Indexing and can also support custom app-specific indexing processors for real-time blockchain data processing.
## Faucet (Only Testnet/Devnet)
* [Faucet API](/build/apis/faucet-api): This API provides the ability to receive test tokens on devnet. Its primary purpose is the development and testing of applications and Move contracts before deploying them to mainnet. On testnet you can mint at the mint page.
The code of each of the above-mentioned APIs is open-sourced on [GitHub](https://github.com/aptos-labs/aptos-core). As such, anyone can operate these APIs, and many independent operators and builders worldwide choose to do so.
### Aptos Labs operated API Deployments
[Aptos Labs](https://aptoslabs.com) operates a deployment of these APIs on behalf of [Aptos Foundation](https://aptosfoundation.org/) for each [Aptos Network](/network/nodes/networks) and makes them available for public consumption.
These APIs allow for limited access on a per-IP basis without an API key (anonymous access). To get much higher rate limits you can sign up for an [Aptos Build](https://build.aptoslabs.com/) account.
# Aptos Labs Aptos Build
[Aptos Build](https://build.aptoslabs.com) is your gateway to access Aptos Labs provided APIs in a quick and easy fashion to power your dapp. Beyond API access, it offers gas station and no-code indexing services.
Learn more about Aptos Build at the dedicated [Aptos Build docs site](https://build.aptoslabs.com/docs).
# Data Providers
If you want to access Aptos blockchain data but don’t need it in real time, there are a few options that let you access this data using SQL or UIs for building dashboards. This type of data is often used for analytics since it allows for aggregations.
## Review of data endpoints
* Hitting the fullnode directly via the [REST API](/build/apis#aptos-fullnode) gives the latest data (historical data will be missing unless it is an archival fullnode).
* The indexer layer on top of this provides a [GRPC transaction stream](/build/indexer/txn-stream/aptos-hosted-txn-stream).
* On top of this transaction stream, we’ve built out product logic tables that can be queried through [GraphQL](/build/indexer/).
* Since the logic to parse out transactions is [public](https://github.com/aptos-labs/aptos-indexer-processors-v2), some vendors have implemented similar parsing logic to create a subset of tables and made them available to query.
## SQL Tables
Indexer defines several processors that create different database tables.
### Core tables
These are parsed directly from the node API response. One option is to split it into the following tables:
* Blocks - version, block height, epoch, timestamp
* Transactions - version, sender, entry function, gas
* Signatures - signature types, signer, fee payer address
* Events - type and data for events
We store data as table items, resources, or modules:
* (write set) changes - change index, change type, resource address
* Table items - table key, table handle, key (content and type), value (content and type)
* (move) resources - resource address, resource type, data
* (move) modules - bytecode for deployed modules
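As a rough illustration, splitting a fullnode transaction into the core rows above might look like the following sketch. The `version`, `gas_used`, and `events` field names follow the JSON example earlier on this page; `sender` and the row shapes are assumptions for illustration.

```typescript
// Sketch: split a node API transaction into core table rows.
interface RawTxn {
  version: string;
  sender?: string;
  gas_used: string;
  events?: { type: string; data: unknown }[];
}

interface TransactionRow { version: number; sender: string | null; gasUsed: number }
interface EventRow { version: number; type: string; data: unknown }

function toRows(txn: RawTxn): { txn: TransactionRow; events: EventRow[] } {
  const version = Number(txn.version);
  return {
    // One row per transaction, keyed by version
    txn: { version, sender: txn.sender ?? null, gasUsed: Number(txn.gas_used) },
    // One row per emitted event, tied back to the transaction version
    events: (txn.events ?? []).map((e) => ({ version, type: e.type, data: e.data })),
  };
}
```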
## Vendors of off-chain data
Most of our data vendors only provide core datasets. A [subset of vendors](https://aptosfoundation.org/currents/aptos-on-chain-data-capabilities-with-dune-nansen-and-other-providers) is listed below.
### Google BigQuery public dataset
Provides data through [google public data](https://console.cloud.google.com/marketplace/product/bigquery-public-data/crypto-aptos-mainnet-us)

We also have sample analytics queries [using the above resources](https://github.com/aptos-labs/explorer/tree/main/analytics)
### Dune
We have a dashboard here:
### Allium
Data source for many downstream vendors such as defillama and rwa.xyz. Raw data is available: They also have transfers for stablecoins
### Artemis
Provides [topline metrics](https://app.artemis.xyz/asset/aptos) as well as chart builder
### Nansen
Provides [topline metrics](https://app.nansen.ai/macro/blockchains?chain=aptos), with additional functionality available with an account.
### Sentio
They have a guide here: Data is found in data source -> external project -> sentio/aptos-overview. They also provide stack tracing of transactions.
### RWA.xyz
Data can be found here: You’ll need to make an account to access stablecoin details.
### Flipside
Has pivoted from dashboard vendor to more of a vibe coding tool.
## Other vendors
We also have some partners who target more enterprise use cases:
* [Token Terminal](https://tokenterminal.com/resources/articles/aptos-data-partnership)
* [The Tie](https://www.thetie.io/insights/news/introducing-aptos-ecosystem-dashboard-and-on-chain-data/)
* [Elliptic](https://www.elliptic.co/media-center/elliptic-partners-with-aptos-foundation-as-a-data-integration-provider-to-offer-compliance-screening-and-risk-services-for-aptos-network)
# Faucet API
The faucet allows users to get `APT` on devnet. On testnet you can only mint at the [mint page](/network/faucet). It is not available on Mainnet.
The endpoints for each faucet are:
* Devnet: `https://faucet.devnet.aptoslabs.com`
## Using the faucet
Each SDK has integration for devnet to use the faucet. Below are a few examples, but you can see more information on each individual [SDK’s documentation](/build/sdks).
### Using the faucet in a wallet
Most wallets, such as [Petra](https://aptosfoundation.org/ecosystem/project/petra) or [Pontem](https://aptosfoundation.org/ecosystem/project/pontem-wallet), will have a faucet button for devnet. See the full list of [Aptos Wallets](https://aptosfoundation.org/ecosystem/projects/wallets).
### Using the faucet in the Aptos CLI
Once you’ve [set up your CLI](/build/cli/setup-cli), you can simply call `fund-with-faucet`. The amount used is in Octas (1 APT = 100,000,000 Octas).
```shellscript
aptos account fund-with-faucet --account 0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6 --amount 100000000
```
### Using the faucet in the TypeScript SDK
Here is an example funding the account `0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6` with 1 APT in Devnet. The amount used is in Octas (1 APT = 100,000,000 Octas).
```typescript
import { Aptos, AptosConfig, Network } from "@aptos-labs/ts-sdk";

const aptos = new Aptos(new AptosConfig({ network: Network.Devnet }));

// Fund the account with 1 APT (100,000,000 Octas)
await aptos.fundAccount({
  accountAddress: "0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6",
  amount: 100_000_000,
});
```
### Using the faucet in the Go SDK
Here is an example funding the account `0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6` with 1 APT in Devnet. The amount used is in Octas (1 APT = 100,000,000 Octas).
```go
package main

import "github.com/aptos-labs/aptos-go-sdk"

func main() {
	// Connect to devnet
	client, err := aptos.NewClient(aptos.DevnetConfig)
	if err != nil {
		panic(err)
	}
	// Fund the account with 1 APT (100,000,000 Octas)
	client.Fund("0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6", 100000000)
}
```
### Calling the faucet: Other languages not supported by SDKs
If you are trying to call the faucet in other languages, you have two options:
1. Generate a client from the [OpenAPI spec](https://github.com/aptos-labs/aptos-core/blob/main/crates/aptos-faucet/doc/spec.yaml).
2. Call the faucet on your own.
For the latter, you will want to build a query similar to this:
```shellscript
curl -X POST \
  'https://faucet.devnet.aptoslabs.com/mint?amount=10000&address=0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6'
```
This means mint 10000 [octas](/network/glossary#Octa) to address `0xd0f523c9e73e6f3d68c16ae883a9febc616e484c4998a72d8899a1009e5a89d6`.
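The same request can be made from TypeScript. A minimal sketch, using the devnet faucet URL from the curl example above; `fetchFn` is injectable so the request-building logic can be tested without hitting the network, and the shape of the response body is not assumed beyond being JSON.

```typescript
// Sketch: POST to the faucet /mint endpoint, as in the curl example above.
type FetchLike = (
  url: string,
  init?: { method?: string },
) => Promise<{ ok: boolean; status: number; json: () => Promise<unknown> }>;

const FAUCET = "https://faucet.devnet.aptoslabs.com";

async function mintOctas(
  address: string,
  amount: number,
  fetchFn: FetchLike = fetch,
): Promise<unknown> {
  const res = await fetchFn(
    `${FAUCET}/mint?amount=${amount}&address=${address}`,
    { method: "POST" },
  );
  if (!res.ok) throw new Error(`faucet returned ${res.status}`);
  return res.json();
}
```

For production use, prefer generating a client from the OpenAPI spec linked above.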
# Fullnode REST API
This API - embedded into Fullnodes - provides a simple, low latency, yet low-level way of reading state and submitting transactions to the Aptos Blockchain. It also supports transaction simulation. For more advanced queries, we recommend using the [Indexer GraphQL API](/build/indexer).
## Fullnode REST API Explorer
[Mainnet Fullnode REST API ](https://fullnode.mainnet.aptoslabs.com/v1/spec#/)REST API Explorer for Mainnet
[Testnet Fullnode REST API ](https://fullnode.testnet.aptoslabs.com/v1/spec#/)REST API Explorer for Testnet
[Devnet Fullnode REST API ](https://fullnode.devnet.aptoslabs.com/v1/spec#/)REST API Explorer for Devnet
## Understanding rate limits
As with the [Aptos Indexer](/build/indexer/indexer-api), the Aptos REST API has rate limits based on compute units. You can learn more about how the rate limiting works by reading the [Aptos Build docs](https://build.aptoslabs.com/docs/start/billing).
## Viewing current and historical state
Most integrations into the Aptos blockchain benefit from a holistic and comprehensive overview of the current and historical state of the blockchain. Aptos provides historical transactions, state, and events, all the result of transaction execution.
* Historical transactions specify the execution status, output, and tie to related events. Each transaction has a unique version number associated with it that dictates its global sequential ordering in the history of the blockchain ledger.
* The state is the representation of all transaction outputs up to a specific version. In other words, a state version is the accumulation of all transactions inclusive of that transaction version.
* As transactions execute, they may emit events. [Events](/network/blockchain/events) are hints about changes in on-chain data.
Note
Ensure the [fullnode](/network/nodes/networks) you are communicating with is up-to-date. The fullnode must reach the version containing your transaction to retrieve relevant data from it. There can be latency from the fullnodes retrieving state from [validator fullnodes](/network/blockchain/fullnodes), which in turn rely upon [validator nodes](/network/blockchain/validator-nodes) as the source of truth.
The storage service on a node employs two forms of pruning that erase data from nodes:
* state
* events, transactions, and everything else
While either of these may be disabled, storing all state versions is not particularly sustainable.
Events and transactions pruning can be disabled by setting [`enable_ledger_pruner`](https://github.com/aptos-labs/aptos-core/blob/cf0bc2e4031a843cdc0c04e70b3f7cd92666afcf/config/src/config/storage_config.rs#L141) to `false` in `storage_config.rs`. This is the default behavior on Mainnet. In the near future, Aptos will provide indexers that mitigate the need to directly query from a node.
The REST API offers querying transactions and events in these ways:
* [Transactions for an account](https://api.devnet.aptoslabs.com/v1/spec#/operations/get_account_transactions)
* [Transactions by version](https://api.devnet.aptoslabs.com/v1/spec#/operations/get_transaction_by_version)
* [Events by event handle](https://api.devnet.aptoslabs.com/v1/spec#/operations/get_events_by_event_handle)
## Reading state with the View function
View functions do not modify blockchain state when called from the API. A [View function](https://github.com/aptos-labs/aptos-core/blob/main/api/src/view_function.rs) and its [input](https://github.com/aptos-labs/aptos-core/blob/main/api/types/src/view.rs) can be used to read potentially complex on-chain state using Move. For example, you can evaluate who has the highest bid in an auction contract. Here are related files:
* [`view_function.rs`](https://github.com/aptos-labs/aptos-core/blob/main/api/src/tests/view_function.rs) for an example
* related [Move](https://github.com/aptos-labs/aptos-core/blob/90c33dc7a18662839cd50f3b70baece0e2dbfc71/aptos-move/framework/aptos-framework/sources/coin.move#L226) code
* [specification](https://github.com/aptos-labs/aptos-core/blob/90c33dc7a18662839cd50f3b70baece0e2dbfc71/api/doc/spec.yaml#L8513).
The view function operates like the Aptos simulation API, though with no side effects and an accessible output path. View functions can be called via the `/view` endpoint. Calls to view functions require the module and function names along with input type parameters and values.
A function does not have to be immutable to be tagged as `#[view]`, but if the function is mutable it will not result in state mutation when called from the API. If you want to tag a mutable function as `#[view]`, consider making it private so that it cannot be maliciously called during runtime.
To use view functions, you must first [publish the module](/build/cli/working-with-move-contracts) through the [Aptos CLI](/build/cli).
In the Aptos CLI, a view function request would look like this:
```shellscript
aptos move view --function-id devnet::message::get_message --profile devnet --args address:devnet
{
"Result": [
"View functions rock!"
]
}
```
In the TypeScript SDK, a view function request would look like this:
```typescript
import { Aptos } from "@aptos-labs/ts-sdk";

const aptos = new Aptos();

// `alice` is an account created and funded earlier
const [balance] = await aptos.view<[string]>({
  function: "0x1::coin::balance",
  typeArguments: ["0x1::aptos_coin::AptosCoin"],
  functionArguments: [alice.accountAddress],
});
expect(balance).toBe("100000000");
```
The view function returns a list of values as a vector. By default, the results are returned in JSON format; however, they can be optionally returned in Binary Canonical Serialization (BCS) encoded format.
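For reference, a raw `/view` call without the SDK might look like the following sketch. The node URL and the account-address argument are placeholder assumptions; the body uses the snake_case field names the REST API expects.

```typescript
// Sketch: calling the `/view` endpoint directly over REST, without the SDK.
// NODE_API and the "0x1" argument are placeholder assumptions.
const NODE_API = "https://api.devnet.aptoslabs.com/v1";

// The body names the view function, its type arguments, and its arguments.
const body = {
  function: "0x1::coin::balance",
  type_arguments: ["0x1::aptos_coin::AptosCoin"],
  arguments: ["0x1"],
};

async function callView(): Promise<unknown[]> {
  const res = await fetch(`${NODE_API}/view`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  // The response is a JSON array of the function's return values.
  return (await res.json()) as unknown[];
}
```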
# Fullnode API Reference
# CLI
The Aptos command line interface (CLI) is a tool to help you compile and test Move contracts. It can also help you quickly play with Aptos features on-chain.
For more advanced users, the CLI can also be used to run a private Aptos network (to help test code locally) and can be helpful for managing a network node.
## 📥 Install the Aptos CLI
[Section titled “📥 Install the Aptos CLI”](#-install-the-aptos-cli)
[Mac ](/build/cli/install-cli/install-cli-mac)Install Aptos CLI via homebrew
[Windows ](/build/cli/install-cli/install-cli-windows)Install Aptos CLI on Windows via powershell script or pre-compiled binary
[Linux ](/build/cli/install-cli/install-cli-linux)Install Aptos CLI on Linux via shell script or pre-compiled binary
[Advanced (Install Specific Versions) ](/build/cli/install-cli/install-cli-specific-version)Build a specific version of the Aptos CLI from source
## ⚙️ Setup the Aptos CLI
[Section titled “⚙️ Setup the Aptos CLI”](#️-setup-the-aptos-cli)
[Setup the CLI ](/build/cli/setup-cli)Setup and configure the Aptos CLI
[Advanced (Move Prover) ](/build/cli/setup-cli/install-move-prover)Setup and install the Move Prover
## 🛠️ Using the Aptos CLI
[Section titled “🛠️ Using the Aptos CLI”](#️-using-the-aptos-cli)
[Move Contracts ](/build/cli/working-with-move-contracts)Compile, Publish, Simulate, and Benchmark Move Contracts
[Trying things On-chain ](/build/cli/trying-things-on-chain)Interact with Aptos, create accounts, query accounts, use a hardware device like Ledger
[Running a Local Network ](/build/cli/running-a-local-network)Run a local node / network
# Formatting Move Contracts
`movefmt` is a formatter tool that makes Move code much easier to write, read, and maintain — greatly improving the development experience on Aptos.
## Installation
[Section titled “Installation”](#installation)
`movefmt` is integrated into the Aptos CLI. To begin using it, first install it using the CLI update command.
```shellscript
# Install movefmt for first time usage
aptos update movefmt
```
To install a specific version of `movefmt`:
```shellscript
# Install a specific version of movefmt
aptos update movefmt --target-version <version>
```
The latest release of `movefmt` can be found [here](https://github.com/movebit/movefmt/releases).
## Format your code
[Section titled “Format your code”](#format-your-code)
Similar to compilation and testing, you can use the following command to format the Move package:
```shellscript
# Format the Move package
aptos move fmt
```
Several ways of emitting the formatting result are supported:
```shellscript
# Format and overwrite all the target move files in the package.
# This is the default behavior if `--emit-mode` is not explicitly specified
aptos move fmt --emit-mode=overwrite
# Print the formatting result to terminal
aptos move fmt --emit-mode=std-out
# Print the formatting result to new files with the suffix `.fmt.out` in the same directory
aptos move fmt --emit-mode=new-file
# Print the difference between before and after formatting
aptos move fmt --emit-mode=diff
```
`movefmt` also provides different options to configure how the code will be formatted. Here is the default configuration:
```plaintext
max_width = 90 # each line can have at most 90 characters
indent_size = 4 # the indent is 4 spaces
tab_spaces = 4 # each tab is identical to 4 spaces
hard_tabs = false # when a tab is inserted, it will be automatically replaced by 4 spaces
```
To override the default options, you can either place a `movefmt.toml` configuration file in the Move package directory or specify options directly on the command line:
```shellscript
# When formatting the code, set `max_width` to 80 and `indent_size` to 2
aptos move fmt --config max_width=80,indent_size=2
```
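Equivalently, the command-line override above can be kept in a `movefmt.toml` in the Move package directory; a minimal sketch:

```plaintext
# movefmt.toml (placed in the Move package directory)
max_width = 80   # override the default of 90
indent_size = 2  # override the default of 4
```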
## Feedback
[Section titled “Feedback”](#feedback)
Aptos Labs remains committed to improving the developer experience for builders using Move on Aptos. If you’re interested in shaping the style guidelines for Move, we would love to hear your comments and feedback [here](https://github.com/movebit/movefmt/issues).
# Install the Aptos CLI on Linux
For Linux, the easiest way to install the Aptos CLI tool is via shell script, although if that does not work, you can also install manually via downloading pre-compiled binaries. The pre-compiled binaries approach is not generally recommended as updating is very manual.
# Install via Script
[Section titled “Install via Script”](#install-via-script)
1. In the terminal, use one of the following commands:
```shellscript
curl -fsSL "https://aptos.dev/scripts/install_cli.sh" | sh
```
Or use the equivalent `wget` command:
```shellscript
wget -qO- "https://aptos.dev/scripts/install_cli.sh" | sh
```
Caution
If you are getting `Illegal instruction` errors when running the CLI, it may be due to your CPU not supporting SIMD instructions. Specifically for older non-SIMD processors or Ubuntu x86\_64 docker containers on ARM Macs, you may need to run the following command instead to skip SIMD instructions:
```shellscript
curl -fsSL "https://aptos.dev/scripts/install_cli.sh" | sh -s -- --generic-linux
```
2. (Optional) It can be helpful to add the Aptos CLI to a folder in your PATH, or to add it to your PATH directly.
* The steps to add a folder to your PATH are shell dependent.
* You can run `echo $SHELL` to print the default shell for your machine, then google specific steps to add a folder to your PATH for that shell.
3. Verify the CLI is installed by opening a new terminal and running `aptos help`.
* You should see a list of commands you can run using the CLI.
* In the future, this is a helpful resource to learn exactly how each command works.
Note
If you would like to update the Aptos CLI to the latest version, you can run `aptos update`.
# Install via Package Manager (Optional)
[Section titled “Install via Package Manager (Optional)”](#install-via-package-manager-optional)
Note
When installing Aptos via a package manager, please update it through the same package manager in the future.
### Arch Linux
[Section titled “Arch Linux”](#arch-linux)
#### Install via AUR (Arch User Repository)
[Section titled “Install via AUR (Arch User Repository)”](#install-via-aur-arch-user-repository)
```shellscript
git clone https://aur.archlinux.org/aptos-bin.git
cd aptos-bin
makepkg -si
```
or use an AUR helper like `yay`:
```shellscript
yay -S aptos-bin
```
# Install via Pre-Compiled Binaries (Backup Method)
[Section titled “Install via Pre-Compiled Binaries (Backup Method)”](#install-via-pre-compiled-binaries-backup-method)
1. Go to the Aptos CLI release page.
2. Click the “Assets” expandable menu for the latest release to see the pre-compiled binaries.
3. Download the zip file for Linux.
1. It’ll have a name like: `aptos-cli--Linux-x86_64.zip` or `aptos-cli--Linux-aarch64.zip`.
2. Make sure you choose the right zip file for your computer architecture (x86\_64 for Intel / AMD or aarch64 for ARM).
3. You will likely have to dismiss warnings that this is a suspicious file when downloading.
4. Unzip the downloaded file.
5. Move the extracted Aptos binary file into your preferred folder.
6. Open a terminal and navigate to your preferred folder.
7. Make `~/aptos` executable by running `chmod +x ~/aptos`.
8. Verify that this installed version works by running `~/aptos help`.
You should see instructions for how to use all CLI commands. These can be helpful in the future when you are trying to understand how to use specific commands.
9. (Optional) It can be helpful to add the Aptos CLI to a folder in your PATH, or to add it to your PATH directly.
* The steps to add a folder to your PATH are shell dependent.
* You can run `echo $SHELL` to print the default shell for your machine, then google specific steps to add a folder to your PATH for that shell.
Note
When using the pre-compiled binaries method, you can update the Aptos CLI by deleting your existing installation, then following the installation steps again.
# Install the Aptos CLI on Mac
For Mac, the easiest way to install the Aptos CLI is with the package manager `brew`.
# Installation
[Section titled “Installation”](#installation)
1. Ensure you have `brew` installed.
2. Open a new terminal and enter the following commands.
```shellscript
brew update
brew install aptos
```
3. Open another terminal and run `aptos help` to verify the CLI is installed.
```shellscript
aptos help
```
Caution
If `brew` does not work for you, you can try the steps here: [Install via Script](#install-via-script) or [Install via Pre-Compiled Binaries](#install-via-pre-compiled-binaries-backup-method).
# Upgrading the CLI
[Section titled “Upgrading the CLI”](#upgrading-the-cli)
Upgrading the CLI with `brew` takes just two commands:
```shellscript
brew update
brew upgrade aptos
```
# Install via Script
[Section titled “Install via Script”](#install-via-script)
1. In the terminal, use one of the following commands:
```shellscript
curl -fsSL "https://aptos.dev/scripts/install_cli.sh" | sh
```
Or use the equivalent `wget` command:
```shellscript
wget -qO- "https://aptos.dev/scripts/install_cli.sh" | sh
```
2. (Optional) It can be helpful to add the Aptos CLI to a folder in your PATH, or to add it to your PATH directly.
* The steps to add a folder to your PATH are shell dependent.
* You can run `echo $SHELL` to print the default shell for your machine, then google specific steps to add a folder to your PATH for that shell.
3. Verify the CLI is installed by opening a new terminal and running `aptos help`.
* You should see a list of commands you can run using the CLI.
* In the future, this is a helpful resource to learn exactly how each command works.
Note
If you would like to update the Aptos CLI to the latest version, you can run `aptos update`.
# Install via Pre-Compiled Binaries (Backup Method)
[Section titled “Install via Pre-Compiled Binaries (Backup Method)”](#install-via-pre-compiled-binaries-backup-method)
1. Go to the Aptos CLI release page.
2. Click the “Assets” expandable menu for the latest release to see the pre-compiled binaries.
3. Download the zip file for macOS.
1. It’ll have a name like: `aptos-cli--macOS-x86_64.zip` or `aptos-cli--macOS-arm64.zip`.
2. Make sure you choose the right zip file for your computer architecture (x86\_64 for Intel / AMD or arm64 for ARM).
3. You will likely have to dismiss warnings that this is a suspicious file when downloading.
4. Unzip the downloaded file.
5. Move the extracted Aptos binary file into your preferred folder.
6. Open a terminal and navigate to your preferred folder.
7. Make `~/aptos` executable by running `chmod +x ~/aptos`.
8. Verify that this installed version works by running `~/aptos help`.
You should see instructions for how to use all CLI commands. These can be helpful in the future when you are trying to understand how to use specific commands.
9. (Optional) It can be helpful to add the Aptos CLI to a folder in your PATH, or to add it to your PATH directly.
* The steps to add a folder to your PATH are shell dependent.
* You can run `echo $SHELL` to print the default shell for your machine, then google specific steps to add a folder to your PATH for that shell.
Note
When using the pre-compiled binaries method, you can update the Aptos CLI by deleting your existing installation, then following the installation steps again.
# Install Specific Aptos CLI Versions (Advanced)
If you need a specific version of the Aptos CLI, you can build it directly from the Aptos source code. This installation method is primarily used to interact with specific features on Devnet which may not have made it to Testnet / Mainnet yet. You may also want to follow these steps if you are running an architecture which does not play well with the existing releases / pre-compiled binaries.
If you do not need this advanced method, you can find the normal install steps [here](/build/cli).
## Install on macOS / Linux
[Section titled “Install on macOS / Linux”](#install-on-macos--linux)
1. Follow the steps to build Aptos from source.
2. Ensure you have `cargo` installed.
3. Build the CLI tool: `cargo build --package aptos --profile cli`.
The binary will be available at `target/cli/aptos`.
4. (Optional) Move this executable to a place in your PATH.
5. Verify the installation worked by running `target/cli/aptos help`.
These help instructions also serve as a useful detailed guide for specific commands.
## Install on Windows
[Section titled “Install on Windows”](#install-on-windows)
1. Follow the steps to build Aptos from source.
2. Ensure you have `cargo` installed.
3. Build the CLI tool: `cargo build --package aptos --profile cli`.
The binary will be available at `target\cli\aptos.exe`.
4. (Optional) Move this executable to a place in your PATH.
5. Verify the installation worked by running `target\cli\aptos.exe help`.
These help instructions also serve as a useful detailed guide for specific commands.
# Install the Aptos CLI on Windows
For Windows, the easiest way to install the Aptos CLI tool is via PowerShell script. If that does not work, you can also install manually via pre-compiled binaries. The pre-compiled binaries approach is not generally recommended as updating is very manual.
# Install via PowerShell Script
[Section titled “Install via PowerShell Script”](#install-via-powershell-script)
1. In PowerShell, run the install script:
```powershell
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser; iwr https://aptos.dev/scripts/install_cli.ps1 | iex
```
2. Verify the CLI is installed by opening a new terminal and running `aptos help`.
* You should see a list of commands you can run using the CLI.
* In the future, this is a helpful resource to learn exactly how each command works.
Note
If you would like to update the Aptos CLI to the latest version via script, you can run `aptos update`.
# Install via Package Manager (Optional)
[Section titled “Install via Package Manager (Optional)”](#install-via-package-manager-optional)
Note
When installing Aptos via a package manager, please update it through the same package manager in the future.
### If you have [Scoop](https://scoop.sh/) installed, you can run the following command to install the Aptos CLI:
[Section titled “If you have Scoop installed, you can run the following command to install the Aptos CLI:”](#if-you-have-scoop-installed-you-can-run-the-following-command-to-install-the-aptos-cli)
```powershell
scoop install https://aptos.dev/scoop/aptos.json
```
### If you have [Chocolatey](https://chocolatey.org/) installed, you can run the following command to install the Aptos CLI:
[Section titled “If you have Chocolatey installed, you can run the following command to install the Aptos CLI:”](#if-you-have-chocolatey-installed-you-can-run-the-following-command-to-install-the-aptos-cli)
```powershell
choco install aptos
```
### If you have [winget](https://winget.run/) installed, you can run the following command to install the Aptos CLI:
[Section titled “If you have winget installed, you can run the following command to install the Aptos CLI:”](#if-you-have-winget-installed-you-can-run-the-following-command-to-install-the-aptos-cli)
```powershell
winget install aptos
```
# Install via Pre-Compiled Binaries (Backup Method)
[Section titled “Install via Pre-Compiled Binaries (Backup Method)”](#install-via-pre-compiled-binaries-backup-method)
1. Go to the Aptos CLI release page.
2. Expand “Assets” to see the pre-compiled binaries.
3. Download the zip file for Windows.
* It will have a name like: `aptos-cli--Windows-x86_64.zip`
* You will likely have to dismiss warnings that this is a suspicious file when downloading.
4. Unzip the downloaded file.
* Move the file to whichever folder you would like to call `aptos` from in the future.
5. Right click, then copy the path to the executable.
Ex. `C:\Users\\Downloads\aptos-cli-3.1.0-Windows-x86_64\aptos.exe`.
Note
You may want to add this path to your PATH environment variable to simplify calling the Aptos CLI going forward.
6. Open PowerShell via the Start Menu.
7. Verify the installation by running the help command.
Use the path you copied earlier to call the Aptos CLI. Ex. `C:\Users\\Downloads\aptos-cli-3.1.0-Windows-x86_64\aptos.exe help`.
Note
When installing with pre-compiled binaries, you can update the Aptos CLI by deleting your existing installation, then following the installation steps again.
Caution
If neither of the above methods work, you will have to build the CLI from source by following these steps: [Install Specific Aptos CLI Versions (Advanced)](/build/cli/install-cli/install-cli-specific-version)
# Managing a Network Node via Aptos CLI
If you are running a [validator node or validator full node (VFN)](/network/nodes/validator-node), you can use the CLI to interact with your node.
Specifically, you can use the CLI to:
1. [Manage staking pools you own](/network/nodes/validator-node/connect-nodes/staking-pool-operations).
2. [Vote on proposals](/network/nodes/validator-node/connect-nodes/staking-pool-voter).
Beyond that, you can run this help command to see more specialized commands the CLI can do relating to operating your node:
```shellscript
aptos node --help
```
# Running a Public Network (Advanced)
Caution
If you just want to run your own local network for testing, you can learn how to do that [here](/build/cli/running-a-local-network).
## Genesis ceremonies
[Section titled “Genesis ceremonies”](#genesis-ceremonies)
The `aptos` tool supports bootstrapping new blockchains through what is known as a genesis ceremony. The output of the genesis ceremony is a set of Move instructions that prepares a blockchain for online operation. The input consists of:
* A set of validators and their configuration
* The initial set of Move modules, known as a framework
* A unique `ChainId` (u8) that distinguishes this from other networks
* For test chains, there also exists an account that manages the minting of AptosCoin
## Generating genesis
[Section titled “Generating genesis”](#generating-genesis)
* The genesis organizer constructs a `Layout` and distributes it.
* The genesis organizer prepares the Aptos framework’s bytecode and distributes it.
* Each participant generates their `ValidatorConfiguration` and distributes it.
* Each participant generates a `genesis.blob` from the resulting contributions.
* The genesis organizer executes the `genesis.blob` to derive the initial waypoint and distributes it.
* Each participant begins their `aptos-node`. Upon startup, the `aptos-node` verifies the `genesis.blob` against the waypoint provided by the genesis organizer.
* The blockchain will begin consensus after a quorum of stake is available.
### Prepare aptos-core
[Section titled “Prepare aptos-core”](#prepare-aptos-core)
The following sections rely on tools from the Aptos source. See [Building Aptos From Source](/network/nodes/building-from-source) for setup.
### The `layout` file
[Section titled “The layout file”](#the-layout-file)
The layout file contains:
* `root_key`: an Ed25519 public key for AptosCoin management.
* `users`: the set of participants
* `chain_id`: the `ChainId` or a unique integer that distinguishes this deployment from other Aptos networks
An example:
```yaml
root_key: "0xca3579457555c80fc7bb39964eb298c414fd60f81a2f8eedb0244ec07a26e575"
users:
- alice
- bob
chain_id: 8
```
### Building the Aptos Framework
[Section titled “Building the Aptos Framework”](#building-the-aptos-framework)
From your Aptos-core repository, build the framework and package it:
```shellscript
cargo run --package framework
mkdir aptos-framework-release
cp aptos-framework/releases/artifacts/current/build/**/bytecode_modules/* aptos-framework-release
```
The framework will be stored within the `aptos-framework-release` directory.
### The `ValidatorConfiguration` file
[Section titled “The ValidatorConfiguration file”](#the-validatorconfiguration-file)
The `ValidatorConfiguration` file contains:
* `account_address`: The account that manages this validator. This must be derived from the `account_key` provided within the `ValidatorConfiguration` file.
* `consensus_key`: The public key for authenticating consensus messages from the validator
* `account_key`: The public key for the account that manages this validator. This is used to derive the `account_address`.
* `network_key`: The public key for both validator and fullnode network authentication and encryption.
* `validator_host`: The network address where the validator resides. This contains a `host` and `port` field. The `host` should either be a DNS name or an IP address. Currently only IPv4 is supported.
* `full_node_host`: An optional network address where the fullnode resides. This contains a `host` and `port` field. The `host` should either be a DNS name or an IP address. Currently only IPv4 is supported.
* `stake_amount`: The number of coins being staked by this node. This is expected to be `1`; if it is different, the configuration will be considered invalid.
An example:
```yaml
account_address: ccd49f3ea764365ac21e99f029ca63a9b0fbfab1c8d8d5482900e4fa32c5448a
consensus_key: "0xa05b8f41057ac72f9ca99f5e3b1b787930f03ba5e448661f2a1fac98371775ee"
account_key: "0x3d15ab64c8b14c9aab95287fd0eb894aad0b4bd929a5581bcc8225b5688f053b"
network_key: "0x43ce1a4ac031b98bb1ee4a5cd72a4cca0fd72933d64b22cef4f1a61895c2e544"
validator_host:
host: bobs_host
port: 6180
full_node_host:
host: bobs_host
port: 6182
stake_amount: 1
```
To generate this using the `aptos` CLI:
1. Generate your validator’s keys:
```shellscript
cargo run --package aptos -- genesis generate-keys --output-dir bobs
```
2. Generate your `ValidatorConfiguration`:
```shellscript
cargo run --package aptos -- \
    genesis set-validator-configuration \
    --keys-dir bobs \
    --username bob \
    --validator-host bobs_host:6180 \
    --full-node-host bobs_host:6182 \
    --local-repository-dir .
```
3. The last command will produce a `bob.yaml` file that should be distributed to other participants for `genesis.blob` generation.
### Generating a genesis and waypoint
[Section titled “Generating a genesis and waypoint”](#generating-a-genesis-and-waypoint)
`genesis.blob` and the waypoint can be generated after obtaining the `layout` file, each of the individual `ValidatorConfiguration` files, and the framework release. It is important to validate that the `ValidatorConfiguration` provided in the earlier stage is the same as in the distribution for generating the `genesis.blob`. If there is a mismatch, inform all participants.
To generate the `genesis.blob` and waypoint:
* Place the `layout` file in a directory, e.g., `genesis`.
* Place all the `ValidatorConfiguration` files into the `genesis` directory.
* Ensure that the `ValidatorConfiguration` files are listed under the set of `users` within the `layout` file.
* Make a `framework` directory within the `genesis` directory and place the framework release `.mv` files into the `framework` directory.
* Use the `aptos` CLI to generate genesis and waypoint:
```shellscript
cargo run --package aptos -- genesis generate-genesis --local-repository-dir genesis
```
### Starting an `aptos-node`
[Section titled “Starting an aptos-node”](#starting-an-aptos-node)
Upon generating the `genesis.blob` and waypoint, place them into your validator and fullnode’s configuration directory and begin your validator and fullnode.
# Replaying Past Transactions
## Basics
[Section titled “Basics”](#basics)
You can replay past transactions locally using the `aptos move replay` command. The command is fairly straightforward, but it requires you to specify two pieces of information:
* `--network`
* This is the network you want to replay on
* Possible values: `mainnet`, `testnet`, `devnet` or ``
* `--txn-id`
* This is the id of the transaction you want to replay
* This is sometimes referred to as the `version` on explorers
* Specifically, it is NOT the hexadecimal transaction hash
Let’s use mainnet transaction [581400718](https://explorer.aptoslabs.com/txn/581400718?network=mainnet) (a simple coin transfer transaction) as an example.
```shellscript
aptos move replay --network mainnet --txn-id 581400718
```
Output
```shellscript
Got 1/1 txns from RestApi.
Replaying transaction...
{
"Result": {
"transaction_hash": "0x1ba73d03a0442a845735a17c7be46f3b51e2acb0e5cf68749305c5a17539ac63",
"gas_used": 7,
"gas_unit_price": 100,
"sender": "c94e16736910cc160347d01de345407fe2d350fce5635ac1150319b0fbf5630e",
"sequence_number": 14637,
"success": true,
"version": 581400718,
"vm_status": "status EXECUTED of type Execution"
}
}
```
Alternatively, if you want to simulate a new transaction, check out [Local Simulation, Benchmarking and Gas Profiling](/build/cli/working-with-move-contracts/local-simulation-benchmarking-and-gas-profiling).
## Alternate Modes
[Section titled “Alternate Modes”](#alternate-modes)
Similar to local simulations, the replay command can be enhanced with one of the following options:
* `--benchmark`: Benchmark the transaction and report the running time(s).
* `--profile-gas`: Profile the transaction for detailed gas usage.
### Benchmarking
[Section titled “Benchmarking”](#benchmarking)
```shellscript
aptos move replay --network mainnet --txn-id 581400718 --benchmark
```
Output
```shellscript
Got 1/1 txns from RestApi.
Benchmarking transaction...
Running time (cold code cache): 914.821µs
Running time (warm code cache): 820.189µs
{
"Result": {
"transaction_hash": "0x1ba73d03a0442a845735a17c7be46f3b51e2acb0e5cf68749305c5a17539ac63",
"gas_used": 7,
"gas_unit_price": 100,
"sender": "c94e16736910cc160347d01de345407fe2d350fce5635ac1150319b0fbf5630e",
"sequence_number": 14637,
"success": true,
"version": 581400718,
"vm_status": "status EXECUTED of type Execution"
}
}
```
It’s worth noting that these running times serve only as informational references, as they are contingent upon the specifications of your local machine and may be influenced by noise or other random factors.
**If you are aiming to optimize your contract, you should base your decisions on the gas profiling results.**
Note
To minimize measurement errors, the benchmark harness executes the same transaction multiple times. For this reason, it may take a while for the benchmark task to complete.
### Gas Profiling
[Section titled “Gas Profiling”](#gas-profiling)
The Aptos Gas Profiler is a powerful tool that can help you understand the gas usage of Aptos transactions. Once activated, it will simulate transactions using an instrumented VM, and generate a web-based report.
The gas profiler can also double as a debugger since the report also includes a full execution trace.
```shellscript
aptos move replay --network mainnet --txn-id 581400718 --profile-gas
```
Output
```shellscript
Got 1/1 txns from RestApi.
Profiling transaction...
Gas report saved to gas-profiling/txn-1ba73d03-0x1-aptos_account-transfer.
{
"Result": {
"transaction_hash": "0x1ba73d03a0442a845735a17c7be46f3b51e2acb0e5cf68749305c5a17539ac63",
"gas_used": 7,
"gas_unit_price": 100,
"sender": "c94e16736910cc160347d01de345407fe2d350fce5635ac1150319b0fbf5630e",
"sequence_number": 14637,
"success": true,
"version": 581400718,
"vm_status": "status EXECUTED of type Execution"
}
}
```
You can then find the [generated gas report](/gas-profiling/sample-report-2/index.html) in the directory gas-profiling:
* gas-profiling/
* txn-1ba73d03-0x1-aptos\_account-transfer/
* assets/
* …
* index.html
To understand the gas report, please refer to [this section](/build/cli/working-with-move-contracts/local-simulation-benchmarking-and-gas-profiling#understanding-the-gas-report) of the local simulation tutorial.
# Running a Local Network via Aptos CLI
Local networks can be helpful when testing your code. They are not connected to any production Aptos networks like mainnet, but they are useful for three main reasons:
1. **No rate limits:** You can interact with hosted services like the Node API, Indexer API, and faucet with no rate-limits to speed up testing.
2. **Reproducibility:** You can set up specific on-chain scenarios and restart the network from scratch at any point to return to a clean slate.
3. **High availability**: The Aptos devnet and testnet networks are periodically upgraded, during which time they can be unavailable. Local development networks are also always available even if you have no internet access.
# Starting A Local Network
[Section titled “Starting A Local Network”](#starting-a-local-network)
1. Ensure you have the Aptos CLI installed.
2. Ensure you have Docker installed.
1. Docker is only needed to create a production-like environment by running the Indexer API. Many downstream tools, such as the Aptos SDK, depend on the Indexer API.
2. Docker recommends that you install via [Docker Desktop](https://www.docker.com/products/docker-desktop/) to get automatic updates.
3. Start Docker.
4. Run the following command in a new terminal to start the private network:
```shellscript
aptos node run-local-testnet --with-indexer-api
```
Caution
Note: Despite the name (`local-testnet`), this has nothing to do with the Aptos testnet; it runs a network entirely local to your machine.
You should expect to see an output similar to this:
```shellscript
Readiness endpoint: http://0.0.0.0:8070/
Indexer API is starting, please wait...
Node API is starting, please wait...
Transaction stream is starting, please wait...
Postgres is starting, please wait...
Faucet is starting, please wait...
Completed generating configuration:
Log file: "/Users/dport/.aptos/testnet/validator.log"
Test dir: "/Users/dport/.aptos/testnet"
Aptos root key path: "/Users/dport/.aptos/testnet/mint.key"
Waypoint: 0:397412c0f96b10fa3daa24bfda962671c3c3ae484e2d67ed60534750e2311f3d
ChainId: 4
REST API endpoint: http://0.0.0.0:8080
Metrics endpoint: http://0.0.0.0:9101/metrics
Aptosnet fullnode network endpoint: /ip4/0.0.0.0/tcp/6181
Indexer gRPC node stream endpoint: 0.0.0.0:50051
Aptos is running, press ctrl-c to exit
Node API is ready. Endpoint: http://0.0.0.0:8080/
Postgres is ready. Endpoint: postgres://postgres@127.0.0.1:5433/local_testnet
Transaction stream is ready. Endpoint: http://0.0.0.0:50051/
Indexer API is ready. Endpoint: http://127.0.0.1:8090/
Faucet is ready. Endpoint: http://127.0.0.1:8081/
Applying post startup steps...
Setup is complete, you can now use the local testnet!
```
5. Wait for the network to start
Once the terminal says `Setup is complete, you can now use the local testnet!` the local network will be running.
Caution
If you ran into an error, look at the common errors below to debug.
Common Errors On Network Startup
### Address Already In Use
[Section titled “Address Already In Use”](#address-already-in-use)
```shellscript
panicked at 'error binding to 0.0.0.0:8080: error creating server listener: Address already in use (os error 48)'
```
This means one of the ports needed by the local network is already in use by another process.
To fix this on Unix systems, you can:
1. Identify the name and PID of the process by running `lsof -i :8080`.
2. Run `kill <pid>` once you know the PID to free up that port.
### Too many open files error
[Section titled “Too many open files error”](#too-many-open-files-error)
```shellscript
panicked at crates/aptos/src/node/local_testnet/logging.rs:64:10:
called `Result::unwrap()` on an `Err` value: Os { code: 24, kind: Uncategorized, message: "Too many open files" }
```
This means there were too many open files on your system. On many Unix systems you can increase the maximum number of open files by adding something like this to your `.zshrc`:
```shellscript
ulimit -n 1048576
```
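You can check the limit before and after raising it; a sketch for POSIX shells (the raise applies only to the current session unless the line is added to your shell's rc file):

```shellscript
# Show the current soft limit on open files.
echo "Open-file limit before: $(ulimit -n)"

# Try to raise it for this session; persist by adding the line to ~/.zshrc.
ulimit -n 1048576 2>/dev/null || echo "Could not raise limit in this session"

echo "Open-file limit now: $(ulimit -n)"
```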
### Docker is not available
[Section titled “Docker is not available”](#docker-is-not-available)
```shellscript
Unexpected error: Failed to apply pre-run steps for Postgres: Docker is not available, confirm it is installed and running. On Linux you may need to use sudo
```
To debug this, try the following fixes:
1. Make sure you have docker installed by running `docker --version`.
2. Ensure the Docker daemon is running by running `docker info` (if this errors saying `Cannot connect to the Docker daemon` Docker is NOT running).
3. Make sure the socket for connecting to Docker is present on your machine in the default location. For example, on Unix systems `/var/run/docker.sock` should exist.
1. If that file does not exist, open Docker Desktop and enable `Settings -> Advanced -> Allow the default Docker socket to be used`.
2. Or, you can find where the Docker socket is by running `docker context inspect | grep Host`, then symlink that location to the default location by running `sudo ln -s /Users/dport/.docker/run/docker.sock /var/run/docker.sock`
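The checks above can be collected into one helper; a sketch, assuming a Unix system:

```shellscript
# Diagnose common Docker availability problems in order.
check_docker() {
  if ! command -v docker >/dev/null 2>&1; then
    echo "docker is not installed"
    return 1
  fi
  if ! docker info >/dev/null 2>&1; then
    echo "Docker daemon is not running (or needs sudo on Linux)"
    return 1
  fi
  if [ ! -S /var/run/docker.sock ]; then
    echo "Default Docker socket missing; see 'docker context inspect'"
    return 1
  fi
  echo "Docker looks healthy"
}

# Example: check_docker
```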
As you can see from the example output in step 4, once the local network is running, you have access to the following services:
* [Node API](/build/apis/fullnode-rest-api): This is a REST API that runs directly on the node. It enables core write functionality such as transaction submission and a limited set of read functionality, such as reading account resources or Move module information.
* [Indexer API](/build/indexer/indexer-api): This is a [GraphQL](https://graphql.org/) API that provides rich read access to indexed blockchain data. If you click on the URL for the Indexer API above, by default [http://127.0.0.1:8090](http://127.0.0.1:8090/), it will open the Hasura Console, a web UI that will help you query the Indexer GraphQL API.
* [Transaction Stream Service](/build/indexer/txn-stream): This is a gRPC stream of transactions used by the Indexer API and SDK. This is only relevant to you if you are developing an [Indexer SDK](/build/indexer/indexer-sdk) custom processor.
* [Postgres](https://www.postgresql.org/): This is the database that the Indexer processors write to. The Indexer API reads from this database.
* [Faucet](/build/apis/faucet-api): You can use this to fund accounts on your local network.
If you do not want to run any of these sub-components of a network, there are flags to disable them.
If you are writing a script and would like to wait for the local network and all its services to come up, you can make a GET request to `http://127.0.0.1:8070`. At first this will return HTTP code [503](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503). When it returns [200](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200), all the services are ready.
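This readiness check can be scripted as a polling loop; a minimal sketch, assuming `curl` is installed and the default readiness port `8070`:

```shellscript
# Poll the localnet readiness endpoint until it returns HTTP 200.
# Returns 0 once ready, 1 after max_attempts failures (one per second).
wait_for_localnet() {
  url="${1:-http://127.0.0.1:8070}"
  max_attempts="${2:-60}"
  i=0
  while [ "$i" -lt "$max_attempts" ]; do
    code=$(curl -s -o /dev/null -w '%{http_code}' "$url" 2>/dev/null)
    [ "$code" = "200" ] && return 0
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Example (with a localnet running):
# wait_for_localnet && echo "All services ready"
```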
For more information on different flags you can pass when starting your local network, or configuration settings such as changing which port certain services run on, run the help command:
```shellscript
aptos node run-local-testnet --help
```
## Using The Local Network
[Section titled “Using The Local Network”](#using-the-local-network)
Now that the network is running, you can use it like you would any other network.
For example, you can create a local profile like this:
```shellscript
aptos init --profile <profile-name> --network local
```
You can then use that profile for any commands going forward. For example, to publish a Move module like the [`hello_blockchain`](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/hello_blockchain) package to your local network, you could run:
```shellscript
aptos move publish --profile <profile-name> --package-dir /opt/git/aptos-core/aptos-move/move-examples/hello_blockchain --named-addresses HelloBlockchain=local
```
### Configuring the TypeScript SDK
[Section titled “Configuring the TypeScript SDK”](#configuring-the-typescript-sdk)
If you want to use the local network with the TypeScript SDK, you can use local network URLs when initializing the client object (`Aptos`):
```tsx
import { Aptos, AptosConfig, Network } from "@aptos-labs/ts-sdk";
const network = Network.LOCAL;
const config = new AptosConfig({ network });
const client = new Aptos(config);
```
### Resetting the local network
[Section titled “Resetting the local network”](#resetting-the-local-network)
Sometimes while developing it is helpful to reset the local network back to its initial state, for example:
* You made backwards incompatible changes to a Move module, and you’d like to redeploy it without renaming it or using a new account.
* You are building an [Indexer SDK](/build/indexer/indexer-sdk) custom processor and would like to index using a fresh network.
* You want to clear all on-chain state, e.g., accounts, objects, etc.
To start with a brand new local network, use the `--force-restart` flag:
```shellscript
aptos node run-local-testnet --force-restart
```
It will then ask you to confirm that you really want to restart the chain, to ensure you do not delete your work by accident.
```shellscript
Are you sure you want to delete the existing chain? [yes/no]
> yes
```
If you do not want to be prompted, include `--assume-yes` as well:
```shellscript
aptos node run-local-testnet --force-restart --assume-yes
```
# Setup CLI Initial Configuration
If you are using the CLI to try things out on-chain, you will need to configure the network, faucet, and credentials you want the CLI to use.
This makes using the CLI easier and more secure as you will not be forced to repeatedly copy addresses or private keys.
Caution
If you still need to install the CLI, follow [these steps](/build/cli/install-cli/install-cli-specific-version).
1. Run `aptos init` and follow the instructions in the command line.
Note
To use default settings, you can provide no input and just press “Enter”. For example:
```shellscript
aptos init
```
```shellscript
Configuring for profile default
Enter your rest endpoint [Current: None | No input: https://api.devnet.aptoslabs.com]
No rest url given, using https://api.devnet.aptoslabs.com...
Enter your faucet endpoint [Current: None | No input: https://faucet.devnet.aptoslabs.com]
No faucet url given, using https://faucet.devnet.aptoslabs.com...
Enter your private key as a hex literal (0x...) [Current: None | No input: Generate new key (or keep one if present)]
No key given, generating key...
Account 00f1f20ddd0b0dd2291b6e42c97274668c479bca70f07c6b6a80b99720779696 doesn't exist, creating it and funding it with 10000 coins
Aptos is now set up for account 00f1f20ddd0b0dd2291b6e42c97274668c479bca70f07c6b6a80b99720779696! Run `aptos help` for more information about commands
{
"Result": "Success"
}
```
2. Later, if you want to update these settings, you can do so by running `aptos init` again.
3. The remaining configuration steps are optional quality-of-life improvements. To continue using the CLI for your specific use case, follow the [usage guide here](/build/cli#%EF%B8%8F-using-the-aptos-cli).
## (Optional) Creating Named Configurations (Profiles)
[Section titled “(Optional) Creating Named Configurations (Profiles)”](#optional-creating-named-configurations-profiles)
For testing more complicated scenarios, you will often want multiple accounts on-chain. One way to do this is to create a named configuration, which we call a profile.
To create a profile, run `aptos init --profile <name-of-profile>`. The generated configuration can then be used by passing the profile name to CLI commands in place of an account argument.
For example:
```shellscript
aptos init --profile bob
```
```shellscript
aptos account fund-with-faucet --profile bob
```
```shellscript
{
"Result": "Added 100000000 Octas to account 0x63169727b08fc137b8720e451f7a90584ccce04c301e151daeadc7b8191fdfad"
}
```
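Amounts in CLI output like the one above are denominated in Octas, the smallest unit of APT (1 APT = 100,000,000 Octas). A quick conversion helper for whole APT amounts:

```shellscript
# Convert whole APT to Octas, the base unit used in CLI output.
# Assumption: integer APT amounts; 1 APT = 100,000,000 Octas.
apt_to_octas() {
  echo $(( $1 * 100000000 ))
}

apt_to_octas 1   # prints 100000000
```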
## (Optional) Setting Up Shell Completion
[Section titled “(Optional) Setting Up Shell Completion”](#optional-setting-up-shell-completion)
One quality-of-life feature you can enable is shell auto-completion.
1. Determine which shell you are using (you can run `echo $SHELL` if you are unsure).
2. Look up where configuration files for shell completions go for that shell (it varies from shell to shell). The supported shells are `[bash, zsh, fish, PowerShell, elvish]`.
3. Run the following command, substituting your shell and the destination file for the completions:
```shellscript
aptos config generate-shell-completions --shell <shell> --output-file <output-file>
```
Example command for [`oh my zsh`](https://ohmyz.sh/):
```shellscript
aptos config generate-shell-completions --shell zsh --output-file ~/.oh-my-zsh/completions/_aptos
```
## (Optional) Global Config
[Section titled “(Optional) Global Config”](#optional-global-config)
By default, the CLI will look for a configuration in `.aptos/config.yaml` in each workspace directory. If you would like to use a shared configuration for all workspaces, you can follow these steps:
1. Create a folder in your home directory called `.aptos` (so it has the path `~/.aptos`).
2. Create a yaml file inside `.aptos` called `global_config.yaml`.
3. Run the command:
```shellscript
aptos config set-global-config --config-type global
```
You should see:
```json
{
"Result": {
"config_type": "Global"
}
}
```
# Install the Move Prover
If you want to use the [Move Prover](/build/smart-contracts/prover), install the Move Prover dependencies after [installing the CLI binary](/build/cli/setup-cli). There are two ways to install the Prover dependencies.
## Installation through Aptos CLI (Recommended)
[Section titled “Installation through Aptos CLI (Recommended)”](#installation-through-aptos-cli-recommended)
1. [Install the latest Aptos CLI binary](/build/cli/install-cli/install-cli-mac).
2. Execute the command `aptos update prover-dependencies`.
Note
Environment variables `BOOGIE_EXE` and `Z3_EXE` will be set automatically after installation. Please make sure they are in effect in the current environment.
## Installation through `aptos-core` (Not Recommended)
[Section titled “Installation through aptos-core (Not Recommended)”](#installation-through-aptos-core-not-recommended)
1. See [Building Aptos From Source](/network/nodes/building-from-source)
2. Then, in the checked out aptos-core directory, install additional Move tools:
Linux / macOS
1. Open a Terminal session.
2. Run the dev setup script to prepare your environment: `./scripts/dev_setup.sh -yp`
3. Update your current shell environment: `source ~/.profile`
Note
`dev_setup.sh -p` updates your `~/.profile` with environment variables to support the installed Move Prover tools. Depending on your shell, you may instead need to update `.bash_profile`, `.zprofile`, or another startup file.
Windows
1. Open a PowerShell terminal as an administrator.
2. Run the dev setup script to prepare your environment: `PowerShell -ExecutionPolicy Bypass -File ./scripts/windows_dev_setup.ps1 -y`
After installation, you can run the Move Prover to prove an [example](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/hello_prover):
```shellscript
aptos move prove --package-dir aptos-move/move-examples/hello_prover/
```
## Troubleshooting
[Section titled “Troubleshooting”](#troubleshooting)
If you encounter errors like the one below when running the command, double-check your Aptos CLI version or verify that you’re using the correct `aptos` tool, especially if you have multiple versions installed.
```shellscript
error: unexpected token
┌─ ~/.move/https___github_com_aptos-labs_aptos-core_git_main/aptos-move/framework/aptos-framework/sources/randomness.move:515:16
│
515 │ for (i in 0..n) {
│ - ^ Expected ')'
│ │
│ To match this '('
{
"Error": "Move Prover failed: exiting with model building errors"
}
```
# Start a Move package from a template
Follow the steps below to quickly get started.
1. Initialize
Run the following to initialize a package using the `hello-blockchain` template:
```shellscript
aptos move init --name hello_blockchain --template hello-blockchain
```
2. Start building
The template creates a `hello_blockchain.move` file under `sources` to help get you started.
hello_blockchain.move
```move
module hello_blockchain::message {
use std::error;
use std::signer;
use std::string;
use aptos_framework::event;
#[test_only]
use std::debug;
//:!:>resource
struct MessageHolder has key {
message: string::String,
}
//<:!:resource
#[event]
struct MessageChange has drop, store {
account: address,
from_message: string::String,
to_message: string::String,
}
/// There is no message present
const ENO_MESSAGE: u64 = 0;
#[view]
public fun get_message(addr: address): string::String acquires MessageHolder {
assert!(exists<MessageHolder>(addr), error::not_found(ENO_MESSAGE));
borrow_global<MessageHolder>(addr).message
}
public entry fun set_message(account: signer, message: string::String)
acquires MessageHolder {
let account_addr = signer::address_of(&account);
if (!exists<MessageHolder>(account_addr)) {
move_to(&account, MessageHolder {
message,
})
} else {
let old_message_holder = borrow_global_mut<MessageHolder>(account_addr);
let from_message = old_message_holder.message;
event::emit(MessageChange {
account: account_addr,
from_message,
to_message: copy message,
});
old_message_holder.message = message;
}
}
#[test(account = @0x1)]
public entry fun sender_can_set_message(account: signer) acquires MessageHolder {
let msg: string::String = string::utf8(b"Running test for sender_can_set_message...");
debug::print(&msg);
let addr = signer::address_of(&account);
aptos_framework::account::create_account_for_test(addr);
set_message(account, string::utf8(b"Hello, Blockchain"));
assert!(
get_message(addr) == string::utf8(b"Hello, Blockchain"),
ENO_MESSAGE
);
}
}
```
3. See all templates
Run the following command to see all templates (and for general help initializing a package):
```shellscript
aptos move init --help
```
### Learn More
[Section titled “Learn More”](#learn-more)
[Smart Contracts ](/build/smart-contracts)Learn how to build in Move
[Create Package ](/build/smart-contracts/create-package)Get started by learning how to create a Move package
# Trying Things On-Chain With Aptos CLI
The CLI can be a convenient tool for quickly looking up on-chain data and sending transactions from your accounts.
The most common way to specify which accounts you want to interact with is through profiles. You can create a new profile in the CLI by running the following command:
```shellscript
aptos init --profile <name-of-profile>
```
Any command that takes an account can accept the name of a profile instead. If a command implicitly uses the default profile, it will usually have an optional parameter to use a specified profile instead, which you can find by running `aptos --help`.
With that, the three main things you can use the CLI to do on-chain include:
1. [Looking Up On-Chain Account Info](/build/cli/trying-things-on-chain/looking-up-account-info)
2. [Creating test accounts and sending transactions](/build/cli/trying-things-on-chain/create-test-accounts)
3. [Securely interacting on-chain via a Hardware Ledger](/build/cli/trying-things-on-chain/ledger)
# Create Test Accounts and Send Transactions From Aptos CLI
Note
You can install the Aptos CLI by following [these steps](/build/cli) if you have not done so already.
In general, to make a new account on-chain, you will need to generate keys and then fund the account. On devnet, you can fund a new account by asking a “faucet” account with test Aptos tokens to send some to your account. On testnet you can mint at the [mint page](/network/faucet).
Using the CLI, you can generate and fund a test account using:
```shellscript
aptos init --profile <name-of-profile>
```
Once you have a funded account you can send coins between accounts with the `transfer` command like this:
```shellscript
aptos account transfer --account superuser --amount 100
```
You should see a result like:
```json
{
"Result": {
"gas_used": 73,
"balance_changes": {
"742854f7dca56ea6309b51e8cebb830b12623f9c9d76c72c3242e4cad353dedc": {
"coin": {
"value": "10100"
},
"deposit_events": {
"counter": "2",
"guid": {
"id": {
"addr": "0x742854f7dca56ea6309b51e8cebb830b12623f9c9d76c72c3242e4cad353dedc",
"creation_num": "1"
}
}
},
"withdraw_events": {
"counter": "0",
"guid": {
"id": {
"addr": "0x742854f7dca56ea6309b51e8cebb830b12623f9c9d76c72c3242e4cad353dedc",
"creation_num": "2"
}
}
}
},
"b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb": {
"coin": {
"value": "9827"
},
"deposit_events": {
"counter": "1",
"guid": {
"id": {
"addr": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"creation_num": "1"
}
}
},
"withdraw_events": {
"counter": "1",
"guid": {
"id": {
"addr": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"creation_num": "2"
}
}
}
}
},
"sender": "b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"success": true,
"version": 1139,
"vm_status": "Executed successfully"
}
}
```
This can be useful for manual testing of Move contracts or just to try seeing how the chain works in practice.
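Because the CLI emits JSON, scripts can extract fields from a result. A minimal `sed`-based sketch on a sample result (for anything non-trivial, a real JSON tool such as `jq` is preferable):

```shellscript
# Sample CLI result, abbreviated.
RESULT_JSON='{"Result":{"gas_used":73,"success":true}}'

# Pull out the gas_used value with a POSIX sed expression.
gas_used=$(printf '%s' "$RESULT_JSON" | sed -n 's/.*"gas_used":\([0-9]*\).*/\1/p')
echo "Gas used: $gas_used"   # prints: Gas used: 73
```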
Note
To have more control over what your generated credentials look like, instead of `aptos init`, you can use:
1. `aptos key generate --vanity-prefix 0x<prefix>`
2. `aptos account fund-with-faucet --account <account-address>`
Note, however, that addresses are distinct from keys.
# Use Hardware Ledger via the Aptos CLI
Using a hardware wallet like Ledger is the most secure way to sign transactions on `mainnet` as your private key never leaves your device.
Caution
The `Ledger Nano S` has limited memory and may not be able to sign many transactions on Aptos. If you are trying to sign a transaction that is too big for your device to handle, you will get the error `Wrong raw transaction length`.
## Initial Setup
[Section titled “Initial Setup”](#initial-setup)
You will need to do a few steps of configuration for the Aptos CLI and your Ledger device to sign transactions.
1. Ensure you have the Aptos CLI installed.
You can install the Aptos CLI by following [these steps](/build/cli) if you have not done so already.
2. Ensure you have done the basic setup for your Ledger device.
You can find those steps on [Ledger’s website](https://www.ledger.com/). For example, here are the set up instructions for the [Ledger Nano X](https://support.ledger.com/article/360018784134-zd).
3. Plug your Ledger device into your computer.
4. Install the Aptos app on your Ledger device by following Ledger's app installation instructions.
5. Unlock your Ledger device and open the Aptos app.
Note
Whenever you want to sign using your Ledger you will need to plug it in, unlock it, and open the Aptos app before running any CLI commands.
6. Create a new Ledger profile in the Aptos CLI
```shellscript
aptos init --profile <profile-name> --ledger
```
Then follow the terminal prompts like so:
```text
Configuring for profile <profile-name>
Choose network from [devnet, testnet, mainnet, local, custom | defaults to devnet]
No network given, using devnet...
Please choose an index from the following 5 ledger accounts, or choose an arbitrary index that you want to use:
[0] Derivation path: m/44'/637'/0'/0'/0' (Address: 59836ba1dd0c845713bdab34346688d6f1dba290dbf677929f2fc20593ba0cfb)
[1] Derivation path: m/44'/637'/1'/0'/0' (Address: 21563230cf6d69ee72a51d21920430d844ee48235e708edbafbc69708075a86e)
[2] Derivation path: m/44'/637'/2'/0'/0' (Address: 667446181b3b980ef29f5145a7a2cc34d433fc3ee8c97fc044fd978435f2cb8d)
[3] Derivation path: m/44'/637'/3'/0'/0' (Address: 2dcf037a9f31d93e202c074229a1b69ea8ee4d2f2d63323476001c65b0ec4f31)
[4] Derivation path: m/44'/637'/4'/0'/0' (Address: 23c579a9bdde1a59f1c9d36d8d379aeefe7a5997b5b58bd5a5b0c12a4f170431)
0
Account 59836ba1dd0c845713bdab34346688d6f1dba290dbf677929f2fc20593ba0cfb has been already found on-chain
---
Aptos CLI is now set up for account 59836ba1dd0c845713bdab34346688d6f1dba290dbf677929f2fc20593ba0cfb as profile <profile-name>! Run `aptos --help` for more information about commands
{
"Result": "Success"
}
```
In this example, the first Ledger account was selected by entering `0` after the `aptos init` command. You may choose whichever account you want.
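The listed derivation paths follow the BIP44 scheme with Aptos's registered coin type `637` and all-hardened components. As a sketch, the path for any account index can be constructed like so:

```shellscript
# Build the hardened BIP44 derivation path for an Aptos account index.
aptos_derivation_path() {
  printf "m/44'/637'/%d'/0'/0'\n" "$1"
}

aptos_derivation_path 0      # prints: m/44'/637'/0'/0'/0'
aptos_derivation_path 1000   # prints: m/44'/637'/1000'/0'/0'
```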
**Common errors:**
1. If you see the error `Device Not Found`, make sure to unlock your Ledger then try this step again.
2. If you see the error `Aptos ledger app is not opened`, make sure to open the Aptos app on your Ledger, then try this step again.
7. Finally, you will need to enable blind signing on your Ledger device by following Ledger's instructions.
1. Blind signing allows you to confirm a smart contract interaction you cannot verify through a human-readable language.
2. This is needed to execute transactions without limitation, as some payloads are too large to display.
## Signing Using Ledger
[Section titled “Signing Using Ledger”](#signing-using-ledger)
After doing the initial setup, you can sign transactions by following these steps:
1. Plug in your ledger.
2. Unlock it.
3. Open the Aptos app.
4. Run the Aptos CLI command which requires a signature.
Note
This process works for any command that requires a signature, whether that’s to transfer coins, publish a Move contract, interact with a contract, etc.
For example, if you wanted to publish a Move package like the [`hello_blockchain`](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/hello_blockchain) demo contract you could follow the above steps then run:
```shellscript
aptos move publish --profile --named-addresses hello_blockchain=
```
You should see a response like:
```shellscript
Compiling, may take a little while to download git dependencies...
INCLUDING DEPENDENCY AptosFramework
INCLUDING DEPENDENCY AptosStdlib
INCLUDING DEPENDENCY MoveStdlib
BUILDING Examples
package size 1755 bytes
Do you want to submit a transaction for a range of [139600 - 209400] Octas at a gas unit price of 100 Octas? [yes/no] >
yes
{
"Result": {
"transaction_hash": "0xd5a12594f85284cfd5518d547d084030b178ee926fa3d8cbf699cc0596eff538",
"gas_used": 1396,
"gas_unit_price": 100,
"sender": "59836ba1dd0c845713bdab34346688d6f1dba290dbf677929f2fc20593ba0cfb",
"sequence_number": 0,
"success": true,
"timestamp_us": 1689887104333038,
"version": 126445,
"vm_status": "Executed successfully"
}
}
```
After you have approved publishing this package you will be prompted to sign the transaction on your Ledger device. Once signed, the package will be published to the network!
One error you might run into is `Error: Wrong raw transaction length`. This means that the transaction or package size was too big for your device to sign. Currently, the Aptos Ledger app only supports transactions smaller than 20 KB. The `Ledger Nano S` has less memory than that, which is why it is more likely to produce this error.
## Authentication key rotation
[Section titled “Authentication key rotation”](#authentication-key-rotation)
If you have an active account that is not secured using a hardware wallet, then you may wish to rotate the account’s authentication key so that it corresponds to a [BIP44 account index](https://github.com/bitcoin/bips/blob/master/bip-0044.mediawiki) private key held on your Ledger.
Alternatively, if you have an account linked with a Ledger hardware wallet that you wish to publish a large package from, you might want to temporarily rotate the account’s authentication key to a hot key to avoid memory issues.
This tutorial will walk you through both scenarios.
Caution
Before you start this tutorial make sure you have completed the [key rotation guide](/build/guides/key-rotation).
1. Complete the key rotation guide
Confirm that you have completed the [key rotation guide](/build/guides/key-rotation).
2. Verify your Ledger is ready
1. Connect and unlock your Ledger.
2. Check what version of the Aptos app you have: `Aptos > About > Version`.
3. If you do not have version `0.6.9` or higher, update it using Ledger Live.
4. Enable blind signing: `Aptos > Settings > Enable Blind Signing`.
3. Start a localnet
Start a localnet:
```shellscript
aptos node run-localnet
```
The localnet is ready when it prints out:
```shellscript
Applying post startup steps...
Setup is complete, you can now use the localnet!
```
Note
If you are a power user on macOS or Linux, the following command can be used to start a fresh localnet as a background process:
```shellscript
mkdir -p localnet-data
aptos node run-localnet \
--assume-yes \
--test-dir localnet-data \
--force-restart &
export LOCALNET_PID=$!
```
You can then stop the localnet at any point with the following command:
```shellscript
kill $LOCALNET_PID
```
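In a script, a `trap` can guarantee the background localnet is stopped even if the script exits early; a sketch assuming `LOCALNET_PID` was exported as above:

```shellscript
# Stop the background localnet on any script exit.
cleanup() {
  [ -n "$LOCALNET_PID" ] && kill "$LOCALNET_PID" 2>/dev/null
  return 0
}
trap cleanup EXIT
```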
4. Set up localnet hot wallet profile
Create a private key whose authentication key, and thus initial account address, starts with the vanity prefix `0xaaa`:
```shellscript
aptos key generate \
--assume-yes \
--output-file private-key-a \
--vanity-prefix 0xaaa
```
Example output
```shellscript
{
"Result": {
"PublicKey Path": "private-key-a.pub",
"PrivateKey Path": "private-key-a",
"Account Address:": "0xaaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5"
}
}
```
Use the private key to initialize a `hot-wallet-1` profile on the localnet:
```shellscript
aptos init \
--assume-yes \
--network local \
--private-key-file private-key-a \
--profile hot-wallet-1
```
Example output
```shellscript
Configuring for profile hot-wallet-1
Configuring for network Local
Using command line argument for private key
Account 0xaaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5 doesn't exist, creating it and funding it with 100000000 Octas
Account 0xaaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5 funded successfully
---
Aptos CLI is now set up for account 0xaaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5 as profile hot-wallet-1! Run `aptos --help` for more information about commands
{
"Result": "Success"
}
```
5. Rotate the hot wallet key
Rotate the authentication key of the hot wallet to use [BIP44 account index](https://github.com/bitcoin/bips/blob/master/bip-0044.mediawiki) 1000 on your Ledger:
```shellscript
aptos account rotate-key \
--assume-yes \
--new-derivation-index 1000 \
--profile hot-wallet-1 \
--save-to-profile ledger-wallet-1000
```
Note
As a best practice, this command uses a [BIP44 account index](https://github.com/bitcoin/bips/blob/master/bip-0044.mediawiki) that starts at a large number (1000) to indicate that the account is secured by a rotated authentication key on a Ledger, to ensure it does not conflict with any other existing accounts.
This practice aids in profile recovery, as shown below.
Follow the instructions from the CLI prompt:
```shellscript
Approve rotation proof challenge signature on your Ledger device
```
Example output
```shellscript
{
"Result": {
"message": "Saved new profile ledger-wallet-1000",
"transaction": {
"transaction_hash": "0x1a6df99651ac170bda10cfb9898fa196321d80a928033791b9d2231f77738bb2",
"gas_used": 448,
"gas_unit_price": 100,
"sender": "aaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5",
"sequence_number": 0,
"success": true,
"timestamp_us": 1717986382369736,
"version": 186,
"vm_status": "Executed successfully"
}
}
}
```
Compare the `hot-wallet-1` and `ledger-wallet-1000` profiles, noting that they have the same `account` address but different `public_key` values:
```shellscript
aptos config show-profiles --profile hot-wallet-1
aptos config show-profiles --profile ledger-wallet-1000
```
Example output
```shellscript
{
"Result": {
"hot-wallet-1": {
"has_private_key": true,
"public_key": "0xffb1240fd1267207cc3ed2e1b5386e090a9ca2c844d7f9e0077b3d7dd5d5e430",
"account": "aaa271bca468fb8518f73a732a484b29a1bc296ebcb23f15639d4865a5cebe87",
"rest_url": "http://localhost:8080",
"faucet_url": "http://localhost:8081"
}
}
}
{
"Result": {
"ledger-wallet-1000": {
"has_private_key": false,
"public_key": "0x20ba83f9b9fdab73b0ace8fda26ce24c98cf55060b72b69cfbd25add6a25d09b",
"account": "aaa271bca468fb8518f73a732a484b29a1bc296ebcb23f15639d4865a5cebe87",
"rest_url": "http://localhost:8080",
"faucet_url": "http://localhost:8081"
}
}
}
```
Since the account is no longer secured by the hot private key, delete the private and public key files.
Note
If you are using a UNIX-like machine:
```shell
rm private-key-a
rm private-key-b
rm private-key-a.pub
rm private-key-b.pub
```
Now that you have successfully rotated the authentication key of the hot wallet, you can delete the profiles too:
```shellscript
aptos config delete-profile --profile hot-wallet-1
aptos config delete-profile --profile ledger-wallet-1000
```
Example output
```shellscript
{
"Result": "Deleted profile hot-wallet-1"
}
{
"Result": "Deleted profile ledger-wallet-1000"
}
```
6. Recover profile
Since you know that you rotated the authentication key of the hot wallet to the Ledger, and since you used the best practice of a [BIP44 account index](https://github.com/bitcoin/bips/blob/master/bip-0044.mediawiki) offset of 1000, you can easily recover the profile using the [BIP44 account index](https://github.com/bitcoin/bips/blob/master/bip-0044.mediawiki) alone:
```shellscript
aptos init \
--assume-yes \
--derivation-index 1000 \
--network local \
--profile ledger-wallet-1000-recovered
```
Example output
```shellscript
Configuring for profile ledger-wallet-1000-recovered
Configuring for network Local
Account 0xaaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5 has been already found onchain
---
Aptos CLI is now set up for account 0xaaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5 as profile ledger-wallet-1000-recovered! Run `aptos --help` for more information about commands
{
"Result": "Success"
}
```
Note that this profile corresponds to the specified `0xaaa...` vanity account address:
```shellscript
aptos config show-profiles --profile ledger-wallet-1000-recovered
```
Example output
```shellscript
{
"Result": {
"ledger-wallet-1000-recovered": {
"has_private_key": false,
"public_key": "0x20ba83f9b9fdab73b0ace8fda26ce24c98cf55060b72b69cfbd25add6a25d09b",
"account": "aaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5",
"rest_url": "http://localhost:8080",
"faucet_url": "http://localhost:8081"
}
}
}
```
Note
The `aptos init` command first checks the [`account::OriginatingAddress`](https://github.com/aptos-labs/aptos-core/blob/acb6c891cd42a63b3af96561a1aca164b800c7ee/aptos-move/framework/aptos-framework/sources/account.move#L70) table for determining the account address associated with a public key, so as long as you follow best practices from the [key rotation guide](/build/guides/key-rotation) and only authenticate one account at a time with a private key, you’ll easily be able to recover your profile based on the [BIP44 account index](https://github.com/bitcoin/bips/blob/master/bip-0044.mediawiki) alone.
7. Rotate to new hot private key
If you have an account linked with a Ledger hardware wallet that you wish to use for publication of a large package, you’ll be unable to sign the package publication transaction due to the Ledger’s memory limitations. In this case, you’ll want to temporarily rotate to a hot wallet.
Start by generating a new private key:
```shellscript
aptos key generate \
--assume-yes \
--output-file private-key-b \
--vanity-prefix 0xbbb
```
Example output
```shellscript
{
"Result": {
"PublicKey Path": "private-key-b.pub",
"PrivateKey Path": "private-key-b",
"Account Address:": "0xbbbede2b4f1d49eff0b156ab0756889a6f2bb68f215399d5015da9ac45921b47"
}
}
```
Rotate the authentication key of the account linked with the Ledger to the new private key:
```shellscript
aptos account rotate-key \
--assume-yes \
--new-private-key-file private-key-b \
--profile ledger-wallet-1000-recovered \
--save-to-profile temporary-hot-wallet
```
Follow the instructions from the CLI prompt:
```shellscript
Approve rotation proof challenge signature on your Ledger device
```
```shellscript
Approve transaction on your Ledger device
```
Example output
```shellscript
{
"Result": {
"message": "Saved new profile temporary-hot-wallet",
"transaction": {
"transaction_hash": "0xe49782e92d8fd824fd6dce8f6ed42a11cf8ee84c201f3aa639c435e737c80eaa",
"gas_used": 449,
"gas_unit_price": 100,
"sender": "aaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5",
"sequence_number": 1,
"success": true,
"timestamp_us": 1717986617911082,
"version": 631,
"vm_status": "Executed successfully"
}
}
}
```
Since the CLI profile `ledger-wallet-1000-recovered` is now stale, rename it in case you get interrupted and forget that the private key has been rotated:
```shellscript
aptos config rename-profile \
--profile ledger-wallet-1000-recovered \
--new-profile-name ledger-wallet-1000-stale
```
Example output
```shellscript
{
"Result": "Renamed profile ledger-wallet-1000-recovered to ledger-wallet-1000-stale"
}
```
8. Rotate back to Ledger
Once you’ve signed the large package publication transaction with the hot key, you can rotate the authentication key back to the one corresponding to the private key on the Ledger at index 1000:
```shellscript
aptos account rotate-key \
--assume-yes \
--new-derivation-index 1000 \
--profile temporary-hot-wallet \
--save-to-profile ledger-wallet-1000
```
Follow the instructions from the CLI prompt:
```shellscript
Approve rotation proof challenge signature on your Ledger device
```
Example output
```shellscript
{
"Result": {
"message": "Saved new profile ledger-wallet-1000",
"transaction": {
"transaction_hash": "0x9503819d4ea13bcd9eafed25984807d86d22e8a9837565a7495b54d13890d103",
"gas_used": 449,
"gas_unit_price": 100,
"sender": "aaac71af5f2a4af4ec2639a15799bf9b945afb061c8bee102b636531c1b00eb5",
"sequence_number": 2,
"success": true,
"timestamp_us": 1717986672963544,
"version": 742,
"vm_status": "Executed successfully"
}
}
}
```
Verify that the `ledger-wallet-1000-stale` and `ledger-wallet-1000` profiles have the same `account` address and `public_key`:
```shellscript
aptos config show-profiles --profile ledger-wallet-1000-stale
aptos config show-profiles --profile ledger-wallet-1000
```
Delete the `temporary-hot-wallet` and `ledger-wallet-1000-stale` profiles, which you no longer need.
```shellscript
aptos config delete-profile --profile temporary-hot-wallet
aptos config delete-profile --profile ledger-wallet-1000-stale
```
Example output
```shellscript
{
"Result": "Deleted profile temporary-hot-wallet"
}
{
"Result": "Deleted profile ledger-wallet-1000-stale"
}
```
Since you no longer need the temporary private key, delete it too.
Note
If you are using a UNIX-like machine:
```shell
rm private-key-*
```
9. Clean up
Delete the remaining test profile:
```shell
aptos config delete-profile --profile ledger-wallet-1000
```
Then stop the localnet.
Note
If you are using a UNIX-like machine:
```shell
aptos config delete-profile --profile ledger-wallet-1000
kill $LOCALNET_PID
rm -fr localnet-data
```
# Look Up On-Chain Account Info Using Aptos CLI
Note
You can install the Aptos CLI by following [these steps](/build/cli) if you have not done so already.
You can look up resources and data an account has on-chain by running the following command:
```shellscript
aptos account list --account <your-account-address>
```
This will show all resources that an account has. For example, the output below shows the balance as `coin:value`; the associated coin type for the native gas token APT is `0x1::aptos_coin::AptosCoin`. The value is represented in the coin's smallest subdivision (octas for APT), which has 8 decimal places, so a value of `110000` equals `110000 * 10^-8` APT.
```json
{
"Result": [
{
"coin": {
"value": "110000"
},
"deposit_events": {
"counter": "3",
"guid": {
"id": {
"addr": "0xf1f20ddd0b0dd2291b6e42c97274668c479bca70f07c6b6a80b99720779696",
"creation_num": "2"
}
}
},
"frozen": false,
"withdraw_events": {
"counter": "0",
"guid": {
"id": {
"addr": "0xf1f20ddd0b0dd2291b6e42c97274668c479bca70f07c6b6a80b99720779696",
"creation_num": "3"
}
}
}
}
]
}
```
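Converting a raw subdivision value to whole APT is a division by 10^8. A minimal sketch using standard shell tools, taking the `110000` octa value from the example output above:

```shellscript
# Convert an octa balance (as reported by `aptos account list`) to APT.
# 1 APT = 10^8 octas.
octas=110000
apt=$(awk "BEGIN { printf \"%.8f\", $octas / 100000000 }")
echo "$octas octas = $apt APT"
```

This prints `110000 octas = 0.00110000 APT`.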
If you’re interested in a specific type of account data, you can specify that with the `--query` parameter. The supported queries are:
* `balance` - shows the current balance and a list of deposit and withdrawal events.
* `modules` - shows the Move modules (contracts) published on this account.
* `resources` - the default behavior when no query is specified.
Here’s an example of what calling with the `--query modules` parameter looks like:
```shellscript
aptos account list --query modules
```
This will show all modules that an account has. For example:
```json
{
"Result": [
{
"bytecode": "0xa11ceb0b050000000b01000a020a12031c2504410405452d0772da0108cc0240068c030a0a9603150cab03650d90040400000101010201030104000506000006080004070700020e0401060100080001000009020300010f0404000410060100031107000002120709010602130a030106050806080105010802020c0a02000103040508020802070801010a0201060c010800010b0301090002070b030109000900074d657373616765056572726f72056576656e74067369676e657206737472696e67124d6573736167654368616e67654576656e740d4d657373616765486f6c64657206537472696e670b6765745f6d6573736167650b7365745f6d6573736167650c66726f6d5f6d6573736167650a746f5f6d657373616765076d657373616765156d6573736167655f6368616e67655f6576656e74730b4576656e7448616e646c65096e6f745f666f756e6404757466380a616464726573735f6f66106e65775f6576656e745f68616e646c650a656d69745f6576656e74b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb0000000000000000000000000000000000000000000000000000000000000001030800000000000000000002020a08020b08020102020c08020d0b030108000001000101030b0a002901030607001102270b002b0110001402010104010105240b0111030c040e0011040c020a02290120030b05120e000b040e00380012012d0105230b022a010c050a051000140c030a050f010b030a04120038010b040b050f0015020100010100",
"abi": {
"address": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"name": "Message",
"friends": [],
"exposed_functions": [
{
"name": "get_message",
"visibility": "public",
"is_entry": false,
"generic_type_params": [],
"params": [
"address"
],
"return": [
"0x1::string::String"
]
},
{
"name": "set_message",
"visibility": "public",
"is_entry": true,
"generic_type_params": [],
"params": [
"signer",
"vector"
],
"return": []
}
],
"structs": [
{
"name": "MessageChangeEvent",
"is_native": false,
"abilities": [
"drop",
"store"
],
"generic_type_params": [],
"fields": [
{
"name": "from_message",
"type": "0x1::string::String"
},
{
"name": "to_message",
"type": "0x1::string::String"
}
]
},
{
"name": "MessageHolder",
"is_native": false,
"abilities": [
"key"
],
"generic_type_params": [],
"fields": [
{
"name": "message",
"type": "0x1::string::String"
},
{
"name": "message_change_events",
"type": "0x1::event::EventHandle<0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb::Message::MessageChangeEvent>"
}
]
}
]
}
}
]
}
```
# Working With Move Contracts
The Aptos CLI is mostly used to compile, test, and formally verify Move contracts. If you have not installed the Aptos CLI yet, you can do so by following the steps here [Install the Aptos CLI](/build/cli#-install-the-aptos-cli).
You can jump to specific sections by using the table of contents on the right.
To see how to chain together Move contracts on-chain using the CLI, you can follow this [“CLI Arguments” tutorial](/build/cli/working-with-move-contracts/arguments-in-json-tutorial).
Note
Throughout this document there are parts of commands you will have to modify to fit your situation. Those variables are wrapped in angle brackets, e.g. `<your-package-directory>`.
## 1. Compiling Move
[Section titled “1. Compiling Move”](#1-compiling-move)
You can compile a Move package by running:
```shellscript
aptos move compile --package-dir <your-package-directory>
```
Note
The package directory is the folder which contains the `Move.toml` file.
Based on the settings in your `Move.toml` file, you may need to pass in additional information to that compile command.
For example, if you look at the [hello\_blockchain example Move contract](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/hello_blockchain), in the `Move.toml` file it specifies a variable named address called `hello_blockchain`.
```toml
[addresses]
hello_blockchain = "_"
```
So, to compile this, you will need to pass in the value for `hello_blockchain` with the `--named-addresses` parameter. You can use either a full address e.g. `0x123456...7890` or a name of a profile in the CLI e.g. `default` or `superuser`.
Below we will use `default` in our example:
```shellscript
aptos move compile --package-dir aptos-move/move-examples/hello_blockchain/ --named-addresses hello_blockchain=default
```
You can learn more about optional parameters when compiling Move contracts by running `aptos move compile --help`.
## 2. Unit Testing Move Contracts
[Section titled “2. Unit Testing Move Contracts”](#2-unit-testing-move-contracts)
The Aptos CLI can also be used to compile and run unit tests locally by running:
```shellscript
aptos move test --package-dir <your-package-directory>
```
This command both compiles and runs tests, so it needs all the same optional parameters you use when compiling.
You can learn more about the optional parameters for testing move contracts by running `aptos move test --help`.
### Printing Debugging Information
[Section titled “Printing Debugging Information”](#printing-debugging-information)
When writing tests, it can be helpful to print out debug information or stack traces. You can do that by using `debug::print` and `debug::print_stack_trace` to print information when you use `aptos move test`. See an example of how they are used in [DebugDemo.move](https://github.com/aptos-labs/aptos-core/blob/main/crates/aptos/debug-move-example/sources/DebugDemo.move).
To see the output of testing [DebugDemo.move](https://github.com/aptos-labs/aptos-core/blob/main/crates/aptos/debug-move-example/sources/DebugDemo.move)’s package:
1. Clone [aptos-core](https://github.com/aptos-labs/aptos-core).
2. Navigate to the [debug-move-example](https://github.com/aptos-labs/aptos-core/tree/main/crates/aptos/debug-move-example) by running `cd crates/aptos/debug-move-example`.
3. Run `aptos move test`.
You should see:
```shellscript
Running Move unit tests
[debug] 0000000000000000000000000000000000000000000000000000000000000001
Call Stack:
[0] 0000000000000000000000000000000000000000000000000000000000000001::Message::sender_can_set_message
Code:
[4] CallGeneric(0)
[5] MoveLoc(0)
[6] LdConst(0)
> [7] Call(1)
[8] Ret
Locals:
[0] -
[1] 0000000000000000000000000000000000000000000000000000000000000001
Operand Stack:
```
For more on how to write unit tests with Move, follow this [Move tutorial](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/move-tutorial) (step 2 focuses on unit tests).
## 3. Generating Test Coverage Reports
[Section titled “3. Generating Test Coverage Reports”](#3-generating-test-coverage-reports)
The Aptos CLI can be used to analyze and improve the testing of your Move modules. To use this feature:
To see the code coverage of your tests run the following command from your Move package’s directory:
```shellscript
aptos move test --coverage
```
If you would like to focus your coverage down to specific packages, you can do so with the `--filter` option. To narrow even further to specific Move modules, use the `--module` parameter.
For more detailed or advanced coverage information (such as your test coverage in the compiled bytecode), you can run `aptos move coverage`. With that command, the CLI will prompt you for more details on what specifically you would like coverage information about.
You can learn more about optional parameters for test coverage by running `aptos move test --help` and `aptos move coverage --help`.
## 4. Publishing Move Contracts
[Section titled “4. Publishing Move Contracts”](#4-publishing-move-contracts)
To publish a Move contract, you will need to run:
```shellscript
aptos move publish --package-dir <your-package-directory>
```
Note that when you are publishing on the main network, the values you pass into optional parameters like `--named-addresses` will need to reflect real accounts on that network instead of test credentials.
The package will be published to your default profile in the CLI. You can override that and specify which account to publish to using `--profile` in the command. To generate a new profile for a specific account, use `aptos init --profile <profile-name>` and follow the prompts.
Please also note that when publishing Move modules, if multiple modules are in one package, then all modules in that package must use the same account. If they use different accounts, then the publishing will fail at the transaction level.
You can estimate the gas fees associated with publishing your Move contract by using the [Gas Profiler](/build/cli/working-with-move-contracts/local-simulation-benchmarking-and-gas-profiling).
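As a rough sanity check, the total fee charged (in octas) is simply `gas_used` multiplied by `gas_unit_price`, both of which appear in the CLI's JSON output after a transaction. A minimal sketch with illustrative numbers (not taken from a real transaction):

```shellscript
# Compute the fee in octas from the two values reported in the CLI output.
gas_used=1294
gas_unit_price=100
fee_octas=$((gas_used * gas_unit_price))
echo "Fee: $fee_octas octas"
```

This prints `Fee: 129400 octas`.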
Caution
By default Move contracts publish their source code. To avoid publishing with source code, publish with the `--included-artifacts none` argument.
Since the Aptos blockchain is inherently open by design, note that even without source access it is possible to regenerate Move source from published Move bytecode.
## 5. Running Published Contracts
[Section titled “5. Running Published Contracts”](#5-running-published-contracts)
Now that you have published your Move package, you can run it directly from the CLI.
You will first need to construct your `function-id` by combining:
```jsx
<account-address>::<module-name>::<function-name>
```
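For example, the three parts can be combined in a shell variable before invoking the CLI (the address and names below are illustrative, reused from the `hello_blockchain` example):

```shellscript
# Build the fully qualified function ID from its three parts.
addr=0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb
module_name=message
function_name=set_message
function_id="${addr}::${module_name}::${function_name}"
echo "$function_id"
```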
You can then pass in args by using the `--args` parameter.
As an example, if you were to have published the [hello\_blockchain example package](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/hello_blockchain) to an account with an address `b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb` you could run its `set_message` function via the following command:
```shellscript
aptos move run --function-id 0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb::message::set_message --args string:hello!
```
Which should result in:
```json
{
"Result": {
"changes": [
{
"address": "b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"data": {
"authentication_key": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"self_address": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"sequence_number": "3"
},
"event": "write_resource",
"resource": "0x1::account::Account"
},
{
"address": "b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"data": {
"coin": {
"value": "9777"
},
"deposit_events": {
"counter": "1",
"guid": {
"id": {
"addr": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"creation_num": "1"
}
}
},
"withdraw_events": {
"counter": "1",
"guid": {
"id": {
"addr": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"creation_num": "2"
}
}
}
},
"event": "write_resource",
"resource": "0x1::coin::CoinStore<0x1::aptos_coin::AptosCoin>"
},
{
"address": "b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"data": {
"counter": "4"
},
"event": "write_resource",
"resource": "0x1::guid::Generator"
},
{
"address": "b9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"data": {
"message": "hello!",
"message_change_events": {
"counter": "0",
"guid": {
"id": {
"addr": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb",
"creation_num": "3"
}
}
}
},
"event": "write_resource",
"resource": "0xb9bd2cfa58ca29bce1d7add25fce5c62220604cd0236fe3f90d9de91ed9fb8cb::Message::MessageHolder"
}
],
"gas_used": 41,
"success": true,
"version": 3488,
"vm_status": "Executed successfully"
}
}
```
## 6. (Optional) Formally Verifying Move Scripts
[Section titled “6. (Optional) Formally Verifying Move Scripts”](#6-optional-formally-verifying-move-scripts)
For cases where you want to guarantee that your code works as expected beyond unit testing, you can use the [Move Prover](/build/smart-contracts/prover) to formally verify your Move contract code.
You can install the Move Prover by following [these steps](/build/cli/setup-cli/install-move-prover).
Once you have installed the Move Prover, you can use it from the Aptos CLI by running:
```shellscript
aptos move prove --package-dir <your-package-directory>
```
To learn how to formally verify your code, please follow the in-depth Move tutorial [here](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/move-tutorial) (steps 7 and 8 cover how to use the Move Prover and write formal specifications in the example code).
# Arguments in JSON Tutorial
## Package info
[Section titled “Package info”](#package-info)
This section references the [`CliArgs` example package](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/cli_args), which contains the following manifest:
```toml
[package]
name = "CliArgs"
version = "0.1.0"
upgrade_policy = "compatible"
[addresses]
test_account = "_"
[dependencies]
AptosFramework = { git = "https://github.com/aptos-labs/aptos-framework.git", rev = "mainnet", subdir = "aptos-framework" }
```
Here, the package is deployed under the named address `test_account`.
Note
Set your working directory to [`aptos-move/move-examples/cli_args`](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/cli_args) to follow along:
```shellscript
cd <aptos-core-parent-directory>/aptos-core/aptos-move/move-examples/cli_args
```
## Deploying the package
[Section titled “Deploying the package”](#deploying-the-package)
Start by mining a vanity address for Ace, who will deploy the package:
```shellscript
aptos key generate \
--vanity-prefix 0xace \
--output-file ace.key
```
Output
```shellscript
{
"Result": {
"Account Address:": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"PublicKey Path": "ace.key.pub",
"PrivateKey Path": "ace.key"
}
}
```
Note
The exact account address should vary for each run, though the vanity prefix should not.
Store Ace’s address in a shell variable, so you can call it inline later on:
```shellscript
# Your exact address will vary
ace_addr=0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46
```
Fund Ace’s account with the faucet (only works on devnet):
```shellscript
aptos account fund-with-faucet --account $ace_addr
```
Output
```shellscript
{
"Result": "Added 100000000 Octas to account acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46"
}
```
Now publish the package under Ace’s account:
```shellscript
aptos move publish \
--named-addresses test_account=$ace_addr \
--private-key-file ace.key \
--assume-yes
```
Output
```json
{
"Result": {
"transaction_hash": "0x1d7b074dd95724c5459a1c30fe4cb3875e7b0478cc90c87c8e3f21381625bec1",
"gas_used": 1294,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 0,
"success": true,
"timestamp_us": 1685077849297587,
"version": 528422121,
"vm_status": "Executed successfully"
}
}
```
## Entry functions
[Section titled “Entry functions”](#entry-functions)
The only module in the package, `cli_args.move`, defines a simple `Holder` resource with fields of various data types:
```move
module test_account::cli_args {
use std::signer;
use aptos_std::type_info::{Self, TypeInfo};
use std::string::String;
struct Holder has key, drop {
u8_solo: u8,
bytes: vector<u8>,
utf8_string: String,
bool_vec: vector<bool>,
address_vec_vec: vector<vector<address>>,
type_info_1: TypeInfo,
type_info_2: TypeInfo,
}
```
A public entry function with multi-nested vectors can be used to set the fields:
```move
/// Set values in a `Holder` under `account`.
public entry fun set_vals<T1, T2>(
account: signer,
u8_solo: u8,
bytes: vector<u8>,
utf8_string: String,
bool_vec: vector<bool>,
address_vec_vec: vector<vector<address>>,
) acquires Holder {
let account_addr = signer::address_of(&account);
if (exists<Holder>(account_addr)) {
move_from<Holder>(account_addr);
};
move_to(&account, Holder {
u8_solo,
bytes,
utf8_string,
bool_vec,
address_vec_vec,
type_info_1: type_info::type_of<T1>(),
type_info_2: type_info::type_of<T2>(),
});
}
```
After the package has been published, `aptos move run` can be used to call `set_vals()`:
Note
To pass vectors (including nested vectors) as arguments from the command line, use JSON syntax escaped with quotes!
```shellscript
aptos move run \
--function-id $ace_addr::cli_args::set_vals \
--type-args \
0x1::account::Account \
0x1::chain_id::ChainId \
--args \
u8:123 \
"hex:0x1234" \
"string:hello, world\! ♥" \
"bool:[false, true, false, false]" \
'address:[["0xace", "0xbee"], ["0xcad"], []]' \
--private-key-file ace.key \
--assume-yes
```
Output
```json
{
"Result": {
"transaction_hash": "0x5e141dc6c28e86fa9f5594de93d07a014264ebadfb99be6db922a929eb1da24f",
"gas_used": 504,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 1,
"success": true,
"timestamp_us": 1685077888820037,
"version": 528422422,
"vm_status": "Executed successfully"
}
}
```
The function ID, type arguments, and arguments can alternatively be specified in a JSON file:
```json
{
"function_id": "<test_account>::cli_args::set_vals",
"type_args": [
"0x1::account::Account",
"0x1::chain_id::ChainId"
],
"args": [
{
"type": "u8",
"value": 123
},
{
"type": "hex",
"value": "0x1234"
},
{
"type": "string",
"value": "hello, world! ♥"
},
{
"type": "bool",
"value": [
false,
true,
false,
false
]
},
{
"type": "address",
"value": [
[
"0xace",
"0xbee"
],
[
"0xcad"
],
[]
]
}
]
}
```
Here, the call to `aptos move run` looks like:
```shellscript
aptos move run \
--json-file entry_function_arguments.json \
--private-key-file ace.key \
--assume-yes
```
Output
```json
{
"Result": {
"transaction_hash": "0x60a32315bb48bf6d31629332f6b1a3471dd0cb016fdee8d0bb7dcd0be9833e60",
"gas_used": 3,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 2,
"success": true,
"timestamp_us": 1685077961499641,
"version": 528422965,
"vm_status": "Executed successfully"
}
}
```
Note
If you are trying to run the example yourself, don’t forget to substitute Ace’s actual address for `<test_account>` in `entry_function_arguments.json`!
## View functions
[Section titled “View functions”](#view-functions)
Once the values in a `Holder` have been set, the `reveal()` view function can be used to check the first three fields, and to compare type arguments against the last two fields:
```move
struct RevealResult has drop {
u8_solo: u8,
bytes: vector<u8>,
utf8_string: String,
bool_vec: vector<bool>,
address_vec_vec: vector<vector<address>>,
type_info_1_match: bool,
type_info_2_match: bool
}
#[view]
/// Pack into a `RevealResult` the first three fields in host's
/// `Holder`, as well as two `bool` flags denoting if `T1` & `T2`
/// respectively match `Holder.type_info_1` & `Holder.type_info_2`,
/// then return the `RevealResult`.
public fun reveal<T1, T2>(host: address): RevealResult acquires Holder {
let holder_ref = borrow_global<Holder>(host);
RevealResult {
u8_solo: holder_ref.u8_solo,
bytes: holder_ref.bytes,
utf8_string: holder_ref.utf8_string,
bool_vec: holder_ref.bool_vec,
address_vec_vec: holder_ref.address_vec_vec,
type_info_1_match:
type_info::type_of<T1>() == holder_ref.type_info_1,
type_info_2_match:
type_info::type_of<T2>() == holder_ref.type_info_2
}
}
```
This view function can be called with arguments specified either from the CLI or from a JSON file:
```shellscript
aptos move view \
--function-id $ace_addr::cli_args::reveal \
--type-args \
0x1::account::Account \
0x1::account::Account \
--args address:$ace_addr
```
```shellscript
aptos move view --json-file view_function_arguments.json
```
Note
If you are trying to run the example yourself, don’t forget to substitute Ace’s actual address for `<test_account>` in `view_function_arguments.json` (twice)!
```json
{
"function_id": "<test_account>::cli_args::reveal",
"type_args": [
"0x1::account::Account",
"0x1::account::Account"
],
"args": [
{
"type": "address",
"value": "<test_account>"
}
]
}
```
```shellscript
{
"Result": [
{
"address_vec_vec": [
[
"0xace",
"0xbee"
],
[
"0xcad"
],
[]
],
"bool_vec": [
false,
true,
false,
false
],
"bytes": "0x1234",
"type_info_1_match": true,
"type_info_2_match": false,
"u8_solo": 123,
"utf8_string": "hello, world! ♥"
}
]
}
```
## Script functions
[Section titled “Script functions”](#script-functions)
The package also contains a script, `set_vals.move`, which is a wrapper for the setter function:
```move
script {
use test_account::cli_args;
use std::vector;
use std::string::String;
/// Get a `bool` vector where each element indicates `true` if the
/// corresponding element in `u8_vec` is greater than `u8_solo`.
/// Then pack `address_solo` in a `vector<vector<address>>` and
/// pass resulting argument set to public entry function.
fun set_vals<T1, T2>(
account: signer,
u8_solo: u8,
bytes: vector<u8>,
utf8_string: String,
u8_vec: vector<u8>,
address_solo: address,
) {
let bool_vec = vector::map_ref(&u8_vec, |e_ref| *e_ref > u8_solo);
let addr_vec_vec = vector[vector[address_solo]];
cli_args::set_vals<T1, T2>(account, u8_solo, bytes, utf8_string, bool_vec, addr_vec_vec);
}
}
```
First compile the package (this will compile the script):
```shellscript
aptos move compile --named-addresses test_account=$ace_addr
```
Output
```json
{
"Result": [
"acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46::cli_args"
]
}
```
Next, run `aptos move run-script`:
```shellscript
aptos move run-script \
--compiled-script-path build/CliArgs/bytecode_scripts/set_vals.mv \
--type-args \
0x1::account::Account \
0x1::chain_id::ChainId \
--args \
u8:123 \
"hex:0x1234" \
"string:hello, world\! ♥" \
"u8:[122, 123, 124, 125]" \
address:"0xace" \
--private-key-file ace.key \
--assume-yes
```
Output
```json
{
"Result": {
"transaction_hash": "0x1d644eba8187843cc43919469112339bc2c435a49a733ac813b7bc6c79770152",
"gas_used": 3,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 3,
"success": true,
"timestamp_us": 1685078415935612,
"version": 528426413,
"vm_status": "Executed successfully"
}
}
```
```shellscript
aptos move run-script \
--compiled-script-path build/CliArgs/bytecode_scripts/set_vals.mv \
--json-file script_function_arguments.json \
--private-key-file ace.key \
--assume-yes
```
Output
```json
{
"Result": {
"transaction_hash": "0x840e2d6a5ab80d5a570effb3665f775f1755e0fd8d76e52bfa7241aaade883d7",
"gas_used": 3,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 4,
"success": true,
"timestamp_us": 1685078516832128,
"version": 528427132,
"vm_status": "Executed successfully"
}
}
```
```json
{
"type_args": [
"0x1::account::Account",
"0x1::chain_id::ChainId"
],
"args": [
{
"type": "u8",
"value": 123
},
{
"type": "hex",
"value": "0x1234"
},
{
"type": "string",
"value": "hello, world! ♥"
},
{
"type": "u8",
"value": [
122,
123,
124,
125
]
},
{
"type": "address",
"value": "0xace"
}
]
}
```
Both such script function invocations result in the following `reveal()` view function output:
```shellscript
aptos move view \
--function-id $ace_addr::cli_args::reveal \
--type-args \
0x1::account::Account \
0x1::chain_id::ChainId \
--args address:$ace_addr
```
```json
{
"Result": [
{
"address_vec_vec": [["0xace"]],
"bool_vec": [false, false, true, true],
"bytes": "0x1234",
"type_info_1_match": true,
"type_info_2_match": true,
"u8_solo": 123,
"utf8_string": "hello, world! ♥"
}
]
}
```
Note
As of the time of this writing, the `aptos` CLI only supports script function arguments for vectors of type `u8`, and only up to a vector depth of 1. Hence `vector<u16>` and `vector<vector<u8>>` are invalid script function argument types.
# Local Simulation, Benchmarking & Gas Profiling
## Overview
[Section titled “Overview”](#overview)
The previous tutorial demonstrates how you can deploy and interact with Move contracts using various CLI commands.
By default, those commands send a transaction to a remote fullnode for simulation and execution. You can override this behavior and simulate the transaction locally by appending one of the following command line options:
* `--local`: Simulate the transaction locally without conducting any further measurements or analysis.
* `--benchmark`: Benchmark the transaction and report the running time(s).
* `--profile-gas`: Profile the transaction for detailed gas usage.
These additional options can be used in combination with the following CLI commands:
* `aptos move run`
* `aptos move run-script`
* `aptos move publish`
Alternatively, if you are interested in replaying a past transaction, check out [this tutorial](/build/cli/replay-past-transactions).
Note
Local simulations do not result in any changes to the on-chain state.
## Deploying the Example Contract
[Section titled “Deploying the Example Contract”](#deploying-the-example-contract)
For demonstration purposes, we will continue to use the [`hello_blockchain`](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/hello_blockchain) package as an example.
First, publish the package to devnet or testnet (if you haven’t done so already).
Change into the package directory.
```shellscript
cd aptos-move/move-examples/hello_blockchain
```
Then publish the package using the following command.
```shellscript
aptos move publish --named-addresses hello_blockchain=default --assume-yes
```
Output
```shellscript
{
"Result": {
"transaction_hash": "0xe4ae0ec4ea3474b2123838885b04d7f4b046c174d14d7dc1c56916f2eb553bcf",
"gas_used": 1118,
"gas_unit_price": 100,
"sender": "dbcbe741d003a7369d87ec8717afb5df425977106497052f96f4e236372f7dd5",
"sequence_number": 5,
"success": true,
"timestamp_us": 1713914742422749,
"version": 1033819503,
"vm_status": "Executed successfully"
}
}
```
Note that you need to have your CLI profile set up properly and the named addresses bound correctly. Please refer to [CLI Configuration](/build/cli/setup-cli) for more details.
Note
Publishing the package to devnet/testnet is just one way to set the stage for local simulation, not the only one. Alternatively, you can use a local node, or simulate transactions that do not require code to be published first, such as scripts or even the package publishing transaction itself.
## Local Simulation
[Section titled “Local Simulation”](#local-simulation)
Next, execute the entry function `message::set_message` with local simulation enabled using the additional command line option `--local`. This will execute the transaction locally without conducting any further measurements or analysis.
```shellscript
aptos move run --function-id 'default::message::set_message' --args 'string:abc' --local
```
Output
```shellscript
Simulating transaction locally...
{
"Result": {
"transaction_hash": "0x5aab20980688185eed2c9a27bab624c84b8b8117241cd4a367ba2a012069f57b",
"gas_used": 441,
"gas_unit_price": 100,
"sender": "dbcbe741d003a7369d87ec8717afb5df425977106497052f96f4e236372f7dd5",
"success": true,
"version": 1033887414,
"vm_status": "status EXECUTED of type Execution"
}
}
```
Note
Local and remote simulation should produce identical results.
## Benchmarking
[Section titled “Benchmarking”](#benchmarking)
To measure the running time(s) of your transaction, use the `--benchmark` option.
```shellscript
aptos move run --function-id 'default::message::set_message' --args 'string:abc' --benchmark
```
Output
```shellscript
Benchmarking transaction locally...
Running time (cold code cache): 985.141µs
Running time (warm code cache): 848.159µs
{
"Result": {
"transaction_hash": "0xa2fe548d37f12ee79df13e70fdd8212e37074c1b080b89b7d92e82550684ecdb",
"gas_used": 441,
"gas_unit_price": 100,
"sender": "dbcbe741d003a7369d87ec8717afb5df425977106497052f96f4e236372f7dd5",
"success": true,
"version": 1033936831,
"vm_status": "status EXECUTED of type Execution"
}
}
```
It’s worth noting that these running times serve only as informational references, as they are contingent upon the specifications of your local machine and may be influenced by noise or other random factors.
**If you are aiming to optimize your contract, you should base your decisions on the gas profiling results.**
Note
To minimize measurement errors, the benchmark harness executes the same transaction multiple times. For this reason, it may take a while for the benchmark task to complete.
## Gas Profiling
[Section titled “Gas Profiling”](#gas-profiling)
The Aptos Gas Profiler is a powerful tool that can help you understand the gas usage of Aptos transactions. Once activated, it will simulate transactions using an instrumented VM, and generate a web-based report.
The gas profiler can also double as a debugger since the report also includes a full execution trace.
### Using the Gas Profiler
[Section titled “Using the Gas Profiler”](#using-the-gas-profiler)
The gas profiler can be invoked by appending the `--profile-gas` option.
```shellscript
aptos move run --function-id 'default::message::set_message' --args 'string:abc' --profile-gas
```
Output
```shellscript
Simulating transaction locally using the gas profiler...
Gas report saved to gas-profiling/txn-d0bc3422-0xdbcb-message-set_message.
{
"Result": {
"transaction_hash": "0xd0bc342232f14a6a7d2d45251719aee45373bdb53f68403cfc6dc6062c74fa9e",
"gas_used": 441,
"gas_unit_price": 100,
"sender": "dbcbe741d003a7369d87ec8717afb5df425977106497052f96f4e236372f7dd5",
"success": true,
"version": 1034003962,
"vm_status": "status EXECUTED of type Execution"
}
}
```
You can then find the generated gas report in the directory `gas-profiling`:
* hello\_blockchain/
* Move.toml
* sources/
* …
* gas-profiling/
* txn-XXXXXXXX-0xXXXX-message-set\_message/
* assets/
* …
* index.html
`index.html` is the main page of the report, which you can view using your web browser. [Sample report](/gas-profiling/sample-report/index.html)
### Understanding the Gas Report
[Section titled “Understanding the Gas Report”](#understanding-the-gas-report)
The gas report consists of three sections that help you to understand the gas usage through different lenses.
#### Flamegraphs
[Section titled “Flamegraphs”](#flamegraphs)
The first section visualizes the gas usage as two flamegraphs: one for execution & IO, the other for storage. Two graphs are needed because they are measured in different units: the former in gas units, the latter in APT.
You can interact with various elements in the graphs. Hovering your cursor over an item shows its precise cost and percentage.
Clicking an item zooms into it so you can see its child items more clearly. You can reset the view by clicking the “Reset Zoom” button in the top-left corner.
There is also a “Search” button in the top-right corner that lets you match certain items and highlight them.
#### Cost Break-down
[Section titled “Cost Break-down”](#cost-break-down)
The second section is a detailed break-down of all gas costs. The data in this section is categorized, aggregated, and sorted, which can be especially helpful if you know which numbers to look at.
For example, the following tables show the execution costs of all Move bytecode instructions/operations. Each percentage here is relative to the total cost of its parent category (Exec + IO in this case).

#### Full Execution Trace
[Section titled “Full Execution Trace”](#full-execution-trace)
The final section of the gas report is the full execution trace of the transaction that looks like this:
```text
intrinsic 2.76 85.12%
dependencies 0.0607 1.87%
0xdbcb..::message 0.0607 1.87%
0xdbcb..::message::set_message 0.32416 10.00%
create_ty 0.0004 0.01%
create_ty 0.0004 0.01%
create_ty 0.0004 0.01%
create_ty 0.0004 0.01%
create_ty 0.0008 0.02%
imm_borrow_loc 0.00022 0.01%
call 0.00441 0.14%
0x1::signer::address_of 0.007534 0.23%
create_ty 0.0008 0.02%
move_loc 0.000441 0.01%
call 0.004043 0.12%
0x1::signer::borrow_address 0.000735 0.02%
read_ref 0.001295 0.04%
ret 0.00022 0.01%
st_loc 0.000441 0.01%
copy_loc 0.000854 0.03%
load<0xdbcb..::0xdbcb..::message::MessageHolder> 0.302385 9.33%
exists_generic 0.000919 0.03%
not 0.000588 0.02%
br_false 0.000441 0.01%
imm_borrow_loc 0.00022 0.01%
move_loc 0.000441 0.01%
pack 0.000955 0.03%
move_to_generic 0.001838 0.06%
branch 0.000294 0.01%
@28
ret 0.00022 0.01%
ledger writes 0.097756 3.01%
transaction
events
state write ops 0.097756 3.01%
create<0xdbcb..::0xdbcb..::message::MessageHolder> 0.097756 3.01%
```
The left column lists all Move instructions and operations being executed, with each level of indentation indicating a function call.
The middle column represents the gas costs associated with the operations.
There is also a special notation `@number` that represents a jump to a particular location in the bytecode (`@28` in the snippet above). This is purely informational and helps you understand the control flow.
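Because each cost row in the trace has a fixed shape (`<operation> <gas-units> <percentage>`), it is straightforward to post-process. The sketch below, assuming you have copied some rows of the trace into a string, aggregates the total gas units per operation:

```python
from collections import defaultdict

# A few rows copied from the execution trace above.
trace = """\
imm_borrow_loc 0.00022 0.01%
call 0.00441 0.14%
create_ty 0.0008 0.02%
move_loc 0.000441 0.01%
imm_borrow_loc 0.00022 0.01%
"""

totals = defaultdict(float)
for line in trace.splitlines():
    parts = line.split()
    # Cost rows look like: <operation> <gas-units> <percentage>
    if len(parts) == 3 and parts[2].endswith("%"):
        totals[parts[0]] += float(parts[1])

# Print operations sorted by aggregate cost, most expensive first.
for op, cost in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{op:<16}{cost:.6f}")
```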
# Multisig Governance Tutorial
## Background
[Section titled “Background”](#background)
This section builds upon the [Arguments in JSON tutorial](/build/cli/working-with-move-contracts/arguments-in-json-tutorial). If you have not done that, please complete that tutorial first.
This tutorial likewise references the [`CliArgs` example package](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/cli_args).
Note
If you would like to follow along, start by completing the [Arguments in JSON](/build/cli/working-with-move-contracts/arguments-in-json-tutorial) tutorial steps!
For this example, Ace and Bee will conduct governance operations from a 2-of-2 “multisig v2” account (an on-chain multisig account per [`multisig_account.move`](https://github.com/aptos-labs/aptos-core/blob/main/aptos-move/framework/aptos-framework/sources/multisig_account.move)).
## Account creation
[Section titled “Account creation”](#account-creation)
Since Ace’s account was created during the [Arguments in JSON](/build/cli/working-with-move-contracts/arguments-in-json-tutorial) tutorial, start by mining a vanity address account for Bee too:
```shellscript
aptos key generate \
--vanity-prefix 0xbee \
--output-file bee.key
```
Output
```shellscript
{
"Result": {
"PublicKey Path": "bee.key.pub",
"PrivateKey Path": "bee.key",
"Account Address:": "0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc"
}
}
```
Note
The exact account address should vary for each run, though the vanity prefix should not.
Store Bee’s address in a shell variable, so you can call it inline later on:
```shellscript
# Your exact address should vary
bee_addr=0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc
```
Fund Bee’s account using the faucet:
```shellscript
aptos account fund-with-faucet --account $bee_addr
```
Output
```shellscript
{
"Result": "Added 100000000 Octas to account beec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc"
}
```
Ace can now create a multisig account:
```shellscript
aptos multisig create \
--additional-owners $bee_addr \
--num-signatures-required 2 \
--private-key-file ace.key \
--assume-yes
```
Output
```shellscript
{
"Result": {
"multisig_address": "57478da34604655c68b1dcb89e4f4a9124b6c0ecc1c59a0931d58cc4e60ac5c5",
"transaction_hash": "0x849cc756de2d3b57210f5d32ae4b5e7d1f80e5d376233885944b6f3cc2124a05",
"gas_used": 1524,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 5,
"success": true,
"timestamp_us": 1685078644186194,
"version": 528428043,
"vm_status": "Executed successfully"
}
}
```
Store the multisig address in a shell variable:
```shellscript
# Your address should vary
multisig_addr=0x57478da34604655c68b1dcb89e4f4a9124b6c0ecc1c59a0931d58cc4e60ac5c5
```
## Inspect the multisig
[Section titled “Inspect the multisig”](#inspect-the-multisig)
Use the assorted [`multisig_account.move` view functions](https://github.com/aptos-labs/aptos-core/blob/9fa0102c3e474d99ea35a0a85c6893604be41611/aptos-move/framework/aptos-framework/sources/multisig_account.move#L237) to inspect the multisig:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::num_signatures_required \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
"2"
]
}
```
```shellscript
aptos move view \
--function-id 0x1::multisig_account::owners \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
[
"0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46"
]
]
}
```
```shellscript
aptos move view \
--function-id 0x1::multisig_account::last_resolved_sequence_number \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
"0"
]
}
```
```shellscript
aptos move view \
--function-id 0x1::multisig_account::next_sequence_number \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
"1"
]
}
```
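The view functions above map onto a simple state machine: proposals are assigned increasing sequence numbers, the proposer implicitly votes yes, and a transaction becomes executable once approvals reach the signature threshold. A simplified Python model of this logic (an illustration, not the on-chain implementation):

```python
from dataclasses import dataclass, field

@dataclass
class MultisigTxn:
    creator: str
    votes: dict = field(default_factory=dict)  # owner address -> bool

@dataclass
class MultisigAccount:
    owners: list
    num_signatures_required: int
    last_resolved_sequence_number: int = 0
    next_sequence_number: int = 1
    pending: dict = field(default_factory=dict)  # sequence number -> MultisigTxn

    def create_transaction(self, creator: str) -> int:
        seq = self.next_sequence_number
        # The proposer implicitly votes yes on their own proposal.
        self.pending[seq] = MultisigTxn(creator, votes={creator: True})
        self.next_sequence_number += 1
        return seq

    def approve(self, seq: int, owner: str) -> None:
        self.pending[seq].votes[owner] = True

    def can_be_executed(self, seq: int) -> bool:
        approvals = sum(self.pending[seq].votes.values())
        return approvals >= self.num_signatures_required

ms = MultisigAccount(owners=["ace", "bee"], num_signatures_required=2)
seq = ms.create_transaction("ace")
print(ms.can_be_executed(seq))  # False: only Ace has voted
ms.approve(seq, "bee")
print(ms.can_be_executed(seq))  # True: the 2-of-2 threshold is reached
```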
## Enqueue a publication transaction
[Section titled “Enqueue a publication transaction”](#enqueue-a-publication-transaction)
The first multisig transaction enqueued will be a transaction for publication of the [`CliArgs` example package](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/move-examples/cli_args). First, generate a publication payload entry function JSON file:
```shellscript
aptos move build-publish-payload \
--named-addresses test_account=$multisig_addr \
--json-output-file publication.json \
--assume-yes
```
Output
```shellscript
{
"Result": "Publication payload entry function JSON file saved to publication.json"
}
```
Now have Ace propose publication of the package from the multisig account, storing only the payload hash on-chain:
```shellscript
aptos multisig create-transaction \
--multisig-address $multisig_addr \
--json-file publication.json \
--store-hash-only \
--private-key-file ace.key \
--assume-yes
```
Output
```shellscript
{
"Result": {
"transaction_hash": "0x70c75903f8e1b1c0069f1e84ef9583ad8000f24124b33a746c88d2b031f7fe2c",
"gas_used": 510,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 6,
"success": true,
"timestamp_us": 1685078836492390,
"version": 528429447,
"vm_status": "Executed successfully"
}
}
```
Note that the last resolved sequence number is still 0 because no transactions have been resolved:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::last_resolved_sequence_number \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
"0"
]
}
```
However, the next sequence number has been incremented because a transaction has been enqueued:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::next_sequence_number \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
"2"
]
}
```
The multisig transaction enqueued on-chain can now be inspected:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::get_transaction \
--args \
address:"$multisig_addr" \
u64:1
```
Output
```shellscript
{
"Result": [
{
"creation_time_secs": "1685078836",
"creator": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"payload": {
"vec": []
},
"payload_hash": {
"vec": [
"0x62b91159c1428c1ef488c7290771de458464bd665691d9653d195bc28e0d2080"
]
},
"votes": {
"data": [
{
"key": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"value": true
}
]
}
}
]
}
```
Note from the above result that no payload is stored on-chain, and that Ace implicitly approved the transaction (voted `true`) upon the submission of the proposal.
## Enqueue a governance parameter transaction
[Section titled “Enqueue a governance parameter transaction”](#enqueue-a-governance-parameter-transaction)
Now have Bee enqueue a governance parameter setter transaction, storing the entire transaction payload on-chain:
```shellscript
aptos multisig create-transaction \
--multisig-address $multisig_addr \
--function-id $multisig_addr::cli_args::set_vals \
--type-args \
0x1::account::Account \
0x1::chain_id::ChainId \
--args \
u8:123 \
"bool:[false, true, false, false]" \
'address:[["0xace", "0xbee"], ["0xcad"], []]' \
--private-key-file bee.key \
--assume-yes
```
Output
```shellscript
{
"Result": {
"transaction_hash": "0xd0a348072d5bfc5a2e5d444f92f0ecc10b978dad720b174303bc6d91342f27ec",
"gas_used": 511,
"gas_unit_price": 100,
"sender": "beec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"sequence_number": 0,
"success": true,
"timestamp_us": 1685078954841650,
"version": 528430315,
"vm_status": "Executed successfully"
}
}
```
Note the next sequence number has been incremented again:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::next_sequence_number \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
"3"
]
}
```
Now both the publication and parameter transactions are pending:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::get_pending_transactions \
--args \
address:"$multisig_addr"
```
Output
```shellscript
{
"Result": [
[
{
"creation_time_secs": "1685078836",
"creator": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"payload": {
"vec": []
},
"payload_hash": {
"vec": [
"0x62b91159c1428c1ef488c7290771de458464bd665691d9653d195bc28e0d2080"
]
},
"votes": {
"data": [
{
"key": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"value": true
}
]
}
},
{
"creation_time_secs": "1685078954",
"creator": "0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"payload": {
"vec": [
"0x0057478da34604655c68b1dcb89e4f4a9124b6c0ecc1c59a0931d58cc4e60ac5c508636c695f61726773087365745f76616c7302070000000000000000000000000000000000000000000000000000000000000001076163636f756e74074163636f756e740007000000000000000000000000000000000000000000000000000000000000000108636861696e5f696407436861696e49640003017b0504000100006403020000000000000000000000000000000000000000000000000000000000000ace0000000000000000000000000000000000000000000000000000000000000bee010000000000000000000000000000000000000000000000000000000000000cad00"
]
},
"payload_hash": {
"vec": []
},
"votes": {
"data": [
{
"key": "0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"value": true
}
]
}
}
]
]
}
```
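Note the complementary `vec` fields in the two pending transactions: each proposal stores either its full `payload` or only its `payload_hash`, with the unused option left empty. A small helper illustrating the distinction:

```python
def storage_mode(txn: dict) -> str:
    """Each multisig transaction stores either its full payload or only a hash."""
    if txn["payload"]["vec"]:
        return "full payload stored on-chain"
    return "hash only; payload must be supplied at execution time"

# Shapes taken from the get_pending_transactions output above (bytes truncated).
publication = {"payload": {"vec": []}, "payload_hash": {"vec": ["0x62b9..."]}}
parameter = {"payload": {"vec": ["0x0057..."]}, "payload_hash": {"vec": []}}

print(storage_mode(publication))  # hash only; payload must be supplied at execution time
print(storage_mode(parameter))    # full payload stored on-chain
```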
## Execute the publication transaction
[Section titled “Execute the publication transaction”](#execute-the-publication-transaction)
Since only Ace has voted on the publication transaction (which he implicitly approved upon proposing), the transaction can’t be executed yet:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::can_be_executed \
--args \
address:"$multisig_addr" \
u64:1
```
Output
```shellscript
{
"Result": [
false
]
}
```
Before Bee votes, however, she verifies that the payload hash stored on-chain matches the publication entry function JSON file:
```shellscript
aptos multisig verify-proposal \
--multisig-address $multisig_addr \
--json-file publication.json \
--sequence-number 1
```
Output
```shellscript
{
"Result": {
"Status": "Transaction match",
"Multisig transaction": {
"creation_time_secs": "1685078836",
"creator": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"payload": {
"vec": []
},
"payload_hash": {
"vec": [
"0x62b91159c1428c1ef488c7290771de458464bd665691d9653d195bc28e0d2080"
]
},
"votes": {
"data": [
{
"key": "0xacef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"value": true
}
]
}
}
}
}
```
Since Bee has verified that the on-chain payload hash checks out against her locally-compiled package publication JSON file, she votes yes:
```shellscript
aptos multisig approve \
--multisig-address $multisig_addr \
--sequence-number 1 \
--private-key-file bee.key \
--assume-yes
```
Output
```shellscript
{
"Result": {
"transaction_hash": "0xa5fb49f1077de6aa6d976e6bcc05e4c50c6cd061f1c87e8f1ea74e7a04a06bd1",
"gas_used": 6,
"gas_unit_price": 100,
"sender": "beec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"sequence_number": 1,
"success": true,
"timestamp_us": 1685079892130861,
"version": 528437204,
"vm_status": "Executed successfully"
}
}
```
Now the transaction can be executed:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::can_be_executed \
--args \
address:"$multisig_addr" \
u64:1
```
Output
```shellscript
{
"Result": [
true
]
}
```
Now either Ace or Bee can invoke the publication transaction from the multisig account, passing the full transaction payload since only the hash was stored on-chain:
```shellscript
aptos multisig execute-with-payload \
--multisig-address $multisig_addr \
--json-file publication.json \
--private-key-file bee.key \
--max-gas 10000 \
--assume-yes
```
Note
Pending the resolution of [#8304](https://github.com/aptos-labs/aptos-core/issues/8304), the transaction simulator (which is used to estimate gas costs) is broken for multisig transactions, so you will have to manually specify a max gas amount.
Output
Also pending the resolution of [#8304](https://github.com/aptos-labs/aptos-core/issues/8304), the CLI output for a successful multisig publication transaction execution results in an API error if only the payload hash has been stored on-chain, but the transaction can be manually verified using an explorer.
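Until the estimator is fixed, it helps to remember that `--max-gas` bounds the worst-case fee: the most a transaction can charge is `max_gas` × `gas_unit_price` octas. A quick check with the values used above:

```python
OCTAS_PER_APT = 100_000_000  # 1 APT = 10^8 octas

def max_fee_octas(max_gas: int, gas_unit_price: int) -> int:
    """Upper bound on the fee a transaction can charge."""
    return max_gas * gas_unit_price

# --max-gas 10000 at a gas unit price of 100 octas:
print(max_fee_octas(10_000, 100))                   # 1000000 octas
print(max_fee_octas(10_000, 100) / OCTAS_PER_APT)   # 0.01 APT
```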
## Execute the governance parameter transaction
[Section titled “Execute the governance parameter transaction”](#execute-the-governance-parameter-transaction)
Since only Bee has voted on the governance parameter transaction (which she implicitly approved upon proposing), the transaction can’t be executed yet:
```shellscript
aptos move view \
--function-id 0x1::multisig_account::can_be_executed \
--args \
address:"$multisig_addr" \
u64:2
```
Output
```shellscript
{
"Result": [
false
]
}
```
Before Ace votes, however, he verifies that the payload stored on-chain matches the function arguments he expects:
```shellscript
aptos multisig verify-proposal \
--multisig-address $multisig_addr \
--function-id $multisig_addr::cli_args::set_vals \
--type-args \
0x1::account::Account \
0x1::chain_id::ChainId \
--args \
u8:123 \
"bool:[false, true, false, false]" \
'address:[["0xace", "0xbee"], ["0xcad"], []]' \
--sequence-number 2
```
Output
```shellscript
{
"Result": {
"Status": "Transaction match",
"Multisig transaction": {
"creation_time_secs": "1685078954",
"creator": "0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"payload": {
"vec": [
"0x0057478da34604655c68b1dcb89e4f4a9124b6c0ecc1c59a0931d58cc4e60ac5c508636c695f61726773087365745f76616c7302070000000000000000000000000000000000000000000000000000000000000001076163636f756e74074163636f756e740007000000000000000000000000000000000000000000000000000000000000000108636861696e5f696407436861696e49640003017b0504000100006403020000000000000000000000000000000000000000000000000000000000000ace0000000000000000000000000000000000000000000000000000000000000bee010000000000000000000000000000000000000000000000000000000000000cad00"
]
},
"payload_hash": {
"vec": []
},
"votes": {
"data": [
{
"key": "0xbeec980219d246581cef5166dc6ba5fb1e090c7a7786a5176d111a9029b16ddc",
"value": true
}
]
}
}
}
}
```
Note that the verification fails if he modifies even a single argument:
```shellscript
aptos multisig verify-proposal \
--multisig-address $multisig_addr \
--function-id $multisig_addr::cli_args::set_vals \
--type-args \
0x1::account::Account \
0x1::chain_id::ChainId \
--args \
u8:200 \
"bool:[false, true, false, false]" \
'address:[["0xace", "0xbee"], ["0xcad"], []]' \
--sequence-number 2
```
Output
```shellscript
{
"Error": "Unexpected error: Transaction mismatch: The transaction you provided has a payload hash of 0xe494b0072d6f940317344967cf0e818c80082375833708c773b0275f3ad07e51, but the on-chain transaction proposal you specified has a payload hash of 0x070ed7c3f812f25f585461305d507b96a4e756f784e01c8c59901871267a1580. For more info, see https://aptos.dev/move/move-on-aptos/cli#multisig-governance"
}
```
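The mismatch check works because any change to the payload changes its digest. The sketch below illustrates the comparison using SHA3-256 over stand-in bytes; the real hash is computed over the BCS-encoded transaction payload:

```python
from hashlib import sha3_256

def payload_hash(payload: bytes) -> str:
    """Hex digest in the 0x-prefixed form the CLI displays."""
    return "0x" + sha3_256(payload).hexdigest()

def verify_proposal(local_payload: bytes, on_chain_hash: str) -> str:
    if payload_hash(local_payload) == on_chain_hash:
        return "Transaction match"
    return "Transaction mismatch"

# Stand-in bytes; the real input is the BCS-encoded entry function payload.
original = b"set_vals(u8:123, ...)"
tampered = b"set_vals(u8:200, ...)"

on_chain = payload_hash(original)
print(verify_proposal(original, on_chain))  # Transaction match
print(verify_proposal(tampered, on_chain))  # Transaction mismatch
```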
Ace approves the transaction:
```shellscript
aptos multisig approve \
--multisig-address $multisig_addr \
--sequence-number 2 \
--private-key-file ace.key \
--assume-yes
```
Output
```shellscript
{
"Result": {
"transaction_hash": "0x233427d95832234fa13dddad5e0b225d40168b4c2c6b84f5255eecc3e68401bf",
"gas_used": 6,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 7,
"success": true,
"timestamp_us": 1685080266378400,
"version": 528439883,
"vm_status": "Executed successfully"
}
}
```
Since the payload was stored on-chain, it does not need to be passed in when executing the pending transaction:
```shellscript
aptos multisig execute \
--multisig-address $multisig_addr \
--private-key-file ace.key \
--max-gas 10000 \
--assume-yes
```
Output
```shellscript
{
"Result": {
"transaction_hash": "0xbc99f929708a1058b223aa880d04607a78ebe503367ec4dab23af4a3bdb541b2",
"gas_used": 505,
"gas_unit_price": 100,
"sender": "acef1b9b7d4ab208b99fed60746d18dcd74865edb7eb3c3f1428233988e4ba46",
"sequence_number": 8,
"success": true,
"timestamp_us": 1685080344045461,
"version": 528440423,
"vm_status": "Executed successfully"
}
}
```
# create-aptos-dapp
`create-aptos-dapp` builds a template project for dapp developers to easily create a front-end and a smart contract on the Aptos network.
## Why use create-aptos-dapp?
[Section titled “Why use create-aptos-dapp?”](#why-use-create-aptos-dapp)
* **Templated Setup**: `create-aptos-dapp` generates predefined end-to-end dapp templates and configuration files for you. It saves manual setup of the project structure, which can be time-consuming and error-prone.
* **Contract Directory:** `create-aptos-dapp` generates a `contract` directory that includes the basic structure for Move smart contract modules.
* **Best Practices**: `create-aptos-dapp` incorporates best practices and structure recommendations to develop for the Aptos network.
* **Built-in Move Commands**: `create-aptos-dapp` includes built-in commands for common tasks, such as initializing the Move compiler, compiling, and publishing smart contracts on-chain.
## Prerequisites
[Section titled “Prerequisites”](#prerequisites)
* [node and npm](https://nodejs.org/en) (npm ≥ 5.2.0)
* [Python 3.6+](https://www.python.org/)
## Using `create-aptos-dapp`
[Section titled “Using create-aptos-dapp”](#using-create-aptos-dapp)
1. Navigate to the directory you want to work in.
```shellscript
cd your/workspace
```
2. Install create-aptos-dapp.
* npx
```shellscript
npx create-aptos-dapp@latest
```
* pnpx
```shellscript
pnpx create-aptos-dapp@latest
```
* yarn
```shellscript
yarn create aptos-dapp
```
* pnpm
```shellscript
pnpm create aptos-dapp@latest
```
3. Follow the CLI prompts.
After installing, you will need to answer several questions about your project including:
1. The project’s name
2. Which template to use ([see below](#current-templates))
3. Whether to use Mainnet or Devnet for testing

## Templates
[Section titled “Templates”](#templates)
`create-aptos-dapp` provides you with premade end-to-end dapp templates, i.e. ready dapps with configuration and a polished UI, to get you started creating a dapp on Aptos.
The goals of the templates are to:
1. Familiarize users with different Aptos standards through end-to-end dapp template examples.
2. Educate users on how to build a dapp on Aptos, from the front-end layer to the smart contract layer and everything in between.
3. Provide users with premade templates to quickly deploy simple dapps.
### Current Templates
[Section titled “Current Templates”](#current-templates)
All current templates are available on [Aptos Learn](https://learn.aptoslabs.com/en/dapp-templates). Read more about specific templates below:
* [Boilerplate Template](https://learn.aptoslabs.com/en/dapp-templates/boilerplate-template)
* [NFT minting dapp Template](https://learn.aptoslabs.com/en/dapp-templates/nft-minting-template)
* [Token minting dapp Template](https://learn.aptoslabs.com/en/dapp-templates/token-minting-template)
* [Token staking dapp Template](https://learn.aptoslabs.com/en/dapp-templates/token-staking-template)
* [Custom indexer template](https://learn.aptoslabs.com/en/dapp-templates/custom-indexer-template)
## Tools `create-aptos-dapp` utilizes
[Section titled “Tools create-aptos-dapp utilizes”](#tools-create-aptos-dapp-utilizes)
* React framework
* Vite development tool
* shadcn/ui + tailwind for styling
* Aptos TS SDK
* Aptos Wallet Adapter
* Node based Move commands
# Create Aptos Dapp FAQ
## Why do we use `import.meta.env`?
[Section titled “Why do we use import.meta.env?”](#why-do-we-use-importmetaenv)
The template is built so that some pages are meant to be accessed only in DEV mode, while others are meant for PROD mode as well. For example, the “create collection” and “my collections” pages are intended for local development and can only be accessed in DEV mode, while the “public mint” page can also be accessed in PROD mode. `import.meta.env` is the Vite way to determine which environment the dapp is running in: DEV or PROD.
## I tried to publish my dapp to a live server but getting `404 error`
[Section titled “I tried to publish my dapp to a live server but getting 404 error”](#i-tried-to-publish-my-dapp-to-a-live-server-but-getting-404-error)
You might need to update the root route. If you deployed your site to `user-name.github.io/my-repo`, then the root route should be updated to `my-repo`.
## What is Tailwind CSS?
[Section titled “What is Tailwind CSS?”](#what-is-tailwind-css)
Tailwind is a utility-first CSS framework that scans your components for class names and generates a static CSS file containing the corresponding styles at build-time.
This framework makes it easy to quickly author styles that are co-located with your component markup without incurring any runtime performance costs. It also helps you to maintain a consistent theme throughout your app that is responsive to light and dark mode.
To learn more about Tailwind CSS, please refer to their official [documentation](https://tailwindcss.com/docs/utility-first).
## What is `shadcn/ui`?
[Section titled “What is shadcn/ui?”](#what-is-shadcnui)
Shadcn is a collection of accessible components that you can copy and paste into your app through their CLI tool. Since the source files live in your app’s codebase, you can customize them as much as you need to.
These components are built on top of [Radix UI Primitives](https://www.radix-ui.com/primitives) and are styled with [Tailwind CSS](https://tailwindcss.com/). To learn more about `shadcn/ui`, please refer to their official [documentation](https://ui.shadcn.com/docs).
## How to modify the theme?
[Section titled “How to modify the theme?”](#how-to-modify-the-theme)
The theme for this template is split across `tailwind.config.js` and `frontend/index.css`. The Tailwind config declares all of the theme colors, text styles, animation keyframes, border radii, etc. The root CSS file (`index.css`) declares the actual color values for light and dark mode as CSS custom properties (CSS variables), the base radius value, and applies any global CSS required.
For example, if you want to make all of the buttons and cards more round in your app, you can increase the base radius value (`--radius`) in `index.css`.
If you want to add a new text style, you can define it in the `addTextStyles` function towards the end of `tailwind.config.js`.
And if you want to modify the primary color of the app, you can update the HSL color values defined in `index.css`.
## How to add components?
[Section titled “How to add components?”](#how-to-add-components)
Additional components can be added through the `shadcn-ui` CLI. For example, if you wish to add a `Switch` component, you can run the following command:
```shellscript
npx shadcn-ui@latest add switch
```
This command will create a `switch.tsx` file in your `frontend/components/ui` directory that contains a styled switch component. For a full list of available shadcn components, please refer to the [shadcn component documentation](https://ui.shadcn.com/docs/components).
If you need to add a component that’s not included in the `shadcn/ui` collection, you’re welcome to add your own components under `frontend/components` or within the `frontend/pages` directory if they’re specific to the page that you’re working on.
## How to add colors?
[Section titled “How to add colors?”](#how-to-add-colors)
If you’re creating your own custom components or adding to the UI in some way, you may need to add some new colors. To add a new color, you must first define the light and dark HSL color values in `frontend/index.css` and then add the new theme color token to the theme defined in `tailwind.config.js`.
For more detailed instructions, please refer to the [shadcn documentation on theming](https://ui.shadcn.com/docs/theming).
## How to add dark mode?
[Section titled “How to add dark mode?”](#how-to-add-dark-mode)
In an effort to maintain simplicity in the dapp template, only light mode is set up. However, color values are defined for both light and dark mode in the theme. If you wish to add dark mode to your app, you simply have to add the shadcn `ThemeProvider` and `ModeToggle` to your app. Once added, the UI will be fully responsive to both light and dark mode. For detailed instructions on how to achieve this, please refer to the [shadcn dark mode documentation](https://ui.shadcn.com/docs/dark-mode/vite).
# Get Started
Content for build/get-started could not be fully rendered due to component compatibility issues.
# Developer Setup
Here is an easy way to set up your environment depending on the type of development.
* Frontend
1. Initialize Frontend Project
Here are some examples of popular choices:
* Next.js
```shellscript
pnpx create-next-app@latest
```
* Vite (TS)
```shellscript
pnpm create vite my-aptos-app --template react-ts
```
2. Install @aptos-labs/ts-sdk
```shellscript
npm i @aptos-labs/ts-sdk
```
3. Setup TS SDK
[TS SDK Quickstart ](/build/sdks/ts-sdk/quickstart)See how to setup your account, network, use the faucet, send / simulate transactions, and more
4. Build your app!
The developer setup for using Aptos in your frontend is now complete. Check out our other tools that streamline the development process:
[Indexer ](/build/indexer)Efficiently query for on-chain state like balances, transaction activity, token data, and more
[TS SDK Examples ](https://github.com/aptos-labs/aptos-ts-sdk/tree/main/examples/typescript)20+ Examples of how to use the TS SDK
[Aptos Build ](https://build.aptoslabs.com/)Hitting rate limits for Fullnode API / Indexers? Get an API Key here
* Smart Contract
1. Install CLI
[Aptos CLI ](/build/cli)Instructions for how to install Aptos CLI
2. Setup Editor or IDE
Add the following extensions to your editor of choice to make Move Development easier
* JetBrains IDE
[Move Language ](https://plugins.jetbrains.com/plugin/14721-move-language)Language server and syntax highlighter for JetBrains IDEs like CLion, Rust Rover, WebStorm, IntelliJ
* VSCode
[Aptos Move Analyzer ](https://marketplace.visualstudio.com/items?itemName=MoveBit.aptos-move-analyzer)Language server and syntax highlighter for VSCode
3. Create Smart Contract
Navigate to your application folder and initialize a new smart contract by running:
```shellscript
aptos move init --name my_todo_list
```
4. Build, Compile, and Deploy Smart Contract!
The developer setup for using Aptos for smart contracts is now complete. For more info see the link to the Dapp tutorial below
[Create Smart Contract Guide ](/build/guides/build-e2e-dapp/1-create-smart-contract#what-is-a-movetoml-file)An easy todo list guide for how to setup a smart contract with Move
* Create Aptos Dapp
1. Install create-aptos-dapp
Run the below command to install a dApp from a template in seconds:
* npx
```shellscript
npx create-aptos-dapp@latest
```
* pnpx
```shellscript
pnpx create-aptos-dapp@latest
```
2. Follow the prompts
Follow the CLI prompts to select a name, [template](/build/create-aptos-dapp#templates), and network for your new dApp.

3. Start building and customizing your new dApp!
Navigate to your new project and open in your favorite IDE to continue building!
Follow the generated `README.md` file for next steps.
4. Continue reading
[Create Aptos Dapp ](/build/create-aptos-dapp)Get more information about the tool
[Templates ](/build/create-aptos-dapp#templates)Browse premade templates
[FAQ ](/build/create-aptos-dapp/faq)Get help for common issues and questions
# Ethereum to Aptos Cheatsheet
To learn more about the differences and similarities, see [Aptos Learn](https://learn.aptoslabs.com/en/tutorials/ethereum-to-aptos-guide/cheat-sheet?workshop=eth-to-aptos).
### High Level Overview
[Section titled “High Level Overview”](#high-level-overview)
| Feature | Ethereum | Aptos |
| -------------------------- | ------------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------- |
| **Smart Contracts** | Solidity, EVM | Move, MoveVM |
| **Benefits** | Mature, wide adoption | Scalability, low latency, predictable fees |
| **Transaction Fees** | Variable, can be high | Lower and more predictable |
| **Account Addresses** | 160-bit | 256-bit |
| **Account Structure** | Balance in a single field, uses nonce | Modules and resources, uses sequence number |
| **Data Storage** | Patricia Merkle Trees | Global storage with resources and modules |
| **Storage Mindset** | Contract-based storage | Account centric mindset for code and data |
| **Example Code** | [ERC-20](https://github.com/OpenZeppelin/openzeppelin-contracts/tree/master/contracts/token/ERC20) | [Fungible Asset](https://github.com/aptos-labs/aptos-core/blob/main/aptos-move/framework/aptos-framework/sources/fungible_asset.move) |
| **Caller ID** | `msg.sender` | `&signer` reference |
| **Upgradeability** | Proxy patterns | Direct module upgrades |
| **Safety & Security** | Vulnerable to attacks like reentrancy | Mitigates common vulnerabilities |
| **Dispatch Type** | Dynamic dispatch | Static dispatch |
| **FT Standard** | [ERC-20](https://docs.openzeppelin.com/contracts/4.x/erc20) | [Coin](/build/smart-contracts/aptos-coin) (legacy) and [Fungible Asset](/build/smart-contracts/fungible-asset) |
| **NFT Standards** | [ERC-721](https://docs.openzeppelin.com/contracts/4.x/erc721), [ERC-1155](https://docs.openzeppelin.com/contracts/4.x/erc1155) | [Digital Asset](/build/smart-contracts/digital-asset) |
| **Blockchain Interaction** | [Ethers.js library](https://docs.ethers.org/v6/) | [Aptos Typescript SDK](/build/sdks/ts-sdk) |
### Comparing Token Standards in Detail
[Section titled “Comparing Token Standards in Detail”](#comparing-token-standards-in-detail)
| | Solidity | Move (Aptos) |
| ---------------------- | ----------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **Token Structure** | Each token is its own contract. | Every token is a typed `Coin` or `FungibleAsset` using a single, reusable contract. |
| **Token Standard** | Must conform to standards like ERC20; implementations can vary. | Uniform interface and implementation for all tokens. |
| **Balance Storage** | Balances stored in contract using a mapping structure. | **Resource-Oriented Balance**: Balances stored as a resource in the user’s account. Resources cannot be arbitrarily created, ensuring integrity of token value. |
| **Transfer Mechanism** | Tokens can be transferred without receiver’s explicit permission. | Except for specific cases (like AptosCoin), tokens generally require the receiver’s `signer` authority for transfer. |
### Comparing EVM and Move VM in Detail
[Section titled “Comparing EVM and Move VM in Detail”](#comparing-evm-and-move-vm-in-detail)
* **EVM**: Known for its flexibility and dynamic dispatch, which allows a wide range of smart contract behaviors. This flexibility, however, can lead to complexities in parallel execution and network operations.
* **Move VM**: Focuses on safety and efficiency with a more integrated approach between the VM and the programming language. Its data storage model allows for better parallelization, and its static dispatch method enhances security and predictability.
| | EVM (Ethereum Virtual Machine) | Move VM (Move Virtual Machine) |
| ------------------------------- | ---------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------ |
| **Data Storage** | Data is stored in the smart contract’s storage space. | Data is stored across smart contracts, user accounts, and objects. |
| **Parallelization** | Parallel execution is limited due to shared storage space. | More parallel execution enabled due to flexible split storage design. |
| **VM and Language Integration** | Separate layers for EVM and smart contract languages (e.g., Solidity). | Seamless integration between VM layer and Move language, with native functions written in Rust executable in Move. |
| **Critical Network Operations** | Implementation of network operations can be complex and less direct. | Critical operations like validator set management natively implemented in Move, allowing for direct execution. |
| **Function Calling** | Dynamic dispatch allows for arbitrary smart contract calls. | Static dispatch aligns with a focus on security and predictable behavior. |
| **Type Safety** | Contract types provide a level of type safety. | Module structs and generics in Move offer robust type safety. |
| **Transaction Safety** | Uses nonces for transaction ordering and safety. | Uses sequence numbers for transaction ordering and safety. |
| **Authenticated Storage** | Yes, with smart contract storage. | Yes, leveraging Move’s resource model. |
| **Object Accessibility** | Objects are not globally accessible; bound to smart contract scope. | Guaranteed global accessibility of objects. |
# Solana to Aptos Cheatsheet
To learn more about the differences and similarities, see [Aptos Learn](https://learn.aptoslabs.com/en/tutorials/solana-to-aptos-guide/cheat-sheet?workshop=solana-to-aptos).
| | Solana | Aptos |
| ---------------------------- | ------------------------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------- |
| **Smart Contracts** | Rust, SVM | Move, MoveVM |
| **Transaction Fees** | Low | Low |
| **Parallelization** | Pessimistic parallelism, need to declare all write accounts | Optimistic parallelism, chain infers write accounts for you |
| **Contract Account Support** | PDA Account | [Object](/build/smart-contracts/objects) or [resource account](/build/smart-contracts/resource-accounts) (objects are encouraged instead) |
| **Data Storage** | Data stored in account owned by programs | Data stored as resource under user account or object |
| **Storage Level** | Program level | Global when stored under object |
| **Storage Mindset** | User data distributed across program-owned accounts | User data distributed across user accounts and objects |
| **Example Code** | [Todo list contract on Solana](https://github.com/aptos-labs/move-by-examples/tree/main/advanced-todo-list/solana) | [Todo list contract on Aptos](https://github.com/aptos-labs/move-by-examples/tree/main/advanced-todo-list/aptos) |
| **Caller ID** | `signer` | `signer` |
| **Upgradability** | Program is upgradable | Module is upgradable |
| **Dispatch Type** | Static dispatch | Static dispatch |
| **FT Standards** | Token program | [Coin](/build/smart-contracts/aptos-coin) (legacy) and [Fungible Asset Standard](/build/smart-contracts/fungible-asset) |
| **NFT Standards** | Token program | [Digital Asset Standard](/build/smart-contracts/digital-asset) |
| **Blockchain Interaction** | Solana web3.js library | [Aptos Typescript SDK](/build/sdks/ts-sdk) |
# Learn from Guides
Choose one of the following guides to learn how to use Aptos for your use cases!
[Your First Transaction ](/build/guides/first-transaction)How to generate, submit and verify a transaction to the Aptos blockchain.
[Your First NFT ](/build/guides/your-first-nft)Learn the Aptos token interface and how to use it to generate your first NFT.
[Your First Fungible Asset ](/build/guides/first-fungible-asset)Learn how to deploy and manage a fungible asset.
[Your First Coin ](/build/guides/first-coin)Learn how to deploy and manage a coin.
[Your First Move Module ](/build/guides/first-move-module)Write your first Move module for the Aptos blockchain.
[Your First Dapp ](/build/guides/build-e2e-dapp)Learn how to build your first dapp. Focuses on building the user interface for the dapp.
[Your First Multisig ](/build/guides/first-multisig)Learn how to perform assorted operations using K-of-N multi-signer authentication.
# Aptos Keyless
## Integrate with Aptos Keyless accounts
[Section titled “Integrate with Aptos Keyless accounts”](#integrate-with-aptos-keyless-accounts)
* [Introduction](/build/guides/aptos-keyless/introduction)
* [OIDC Support and Configuration](/build/guides/aptos-keyless/oidc-support)
* [Integration Guide](/build/guides/aptos-keyless/integration-guide)
* [Simple Example](/build/guides/aptos-keyless/simple-example)
* [How Aptos Keyless works](/build/guides/aptos-keyless/how-keyless-works)
* [Terminology and FAQ](/build/guides/aptos-keyless/other)
## Using an IAM Provider? Integrate with Aptos Federated Keyless
[Section titled “Using an IAM Provider? Integrate with Aptos Federated Keyless”](#using-an-iam-provider-integrate-with-aptos-federated-keyless)
* [Federated Keyless](/build/guides/aptos-keyless/federated-keyless)
## Example
[Section titled “Example”](#example)
Visit this page to learn more [Simple Example](/build/guides/aptos-keyless/simple-example)
[aptos-keyless-example](https://stackblitz.com/edit/vitejs-vite-3fuvtu?embed=1&file=README.md)
# Federated Keyless
## Federated Keyless
[Section titled “Federated Keyless”](#federated-keyless)
[AIP-96](https://github.com/aptos-foundation/AIPs/blob/main/aips/aip-96.md): Federated Keyless is an extension of Aptos Keyless to support more OpenID Connect (OIDC) providers, beyond the ones that are allow-listed in `0x1::jwks` via JWK consensus, while maintaining its decentralization. Federated keyless adds support for authenticating users via identity & access management (IAM) providers (e.g. Auth0, AWS Cognito) as long as your project uses a supported IAM provider for user authentication.
To elaborate further, Federated Keyless enables:
1. Extension of authentication methods
   a. All authentication methods supported by the IAM are available to the dApp, including email/SMS OTP and the IAM’s marketplace of social login integrations like Discord, Naver, X, and more. See the [Auth0 marketplace](https://marketplace.auth0.com/) as an example.
2. Compatibility with existing account systems
   a. Since IAMs also support custom authentication, an application can bring its own username/password (Cognito [docs](https://docs.aws.amazon.com/cognito/latest/developerguide/amazon-cognito-user-pools-authentication-flow.html), Auth0 [docs](https://auth0.com/blog/Custom-Authentication-With-Auth0/)). An application can start using an existing account system already set up with an IAM, or migrate its existing account system to an IAM to generate Keyless-compatible JWTs.
* [Federated Keyless Key Considerations](/build/guides/aptos-keyless/federated-keyless/key-considerations)
* [Federated Keyless Integration Guide](/build/guides/aptos-keyless/federated-keyless/integration-guide)
* [Federated Keyless FAQs](/build/guides/aptos-keyless/federated-keyless/other)
# Federated Keyless Integration Guide
1. Step 1. Set up your IAM provider
Set up your project with your IAM to match the account structure you are looking for.
* [Getting Started with AWS Cognito](https://aws.amazon.com/cognito/getting-started/)
* [Getting Started with Auth0](https://auth0.com/docs/get-started)
2. Step 2. Register the JSON Web Key Set (JWKS) on-chain
Federated Keyless accounts require the JWKS to be registered on-chain.
To register the JWKS, call the `0x1::jwks::update_federated_jwk_set` entry function from an Aptos account; that account will store the JWKs used to validate transactions signed by Federated Keyless accounts.
Caution
**Losing access to the JWK owner account compromises the Federated Keyless accounts created with it**
The JWK owner account is the only account that can update the JWKS. If you lose access to it, you will not be able to update the JWKS, and the Federated Keyless accounts created with it will stop working after a key rotation: users will be unable to validate their JWTs, as these will be signed with a new key whose public key is not registered on the Aptos blockchain.
The JWK set can be found as follows:

* AWS Cognito: `https://cognito-idp.<region>.amazonaws.com/<userPoolId>/.well-known/jwks.json`
* Auth0: `https://<yourDomain>/.well-known/jwks.json`
The TypeScript SDK contains functionality to simplify this process, given the issuer for your IAM provider setup (the `iss` claim value on your users’ JWTs) and an account to use to make the update.
```tsx
import { Account, Aptos, AptosConfig, Network } from '@aptos-labs/ts-sdk'; // Requires version v1.29.1 or later

const aptos = new Aptos(new AptosConfig({ network: Network.DEVNET })); // Configure your network here
const alice = Account.generate(); // Replace with the JWK owner account you want to use
const iss = 'https://your-issuer.example.com/'; // Placeholder: the `iss` claim value from your IAM provider setup
const jwkTxn = await aptos.updateFederatedKeylessJwkSetTransaction({ sender: alice, iss });
await aptos.signAndSubmitTransaction({ signer: alice, transaction: jwkTxn });
```
You can use the interactive example provided by the SDK to easily register the JWKS for your IAM provider in devnet or testnet. This will set up the JWK owner account with a Google Keyless account.
```shellscript
git clone https://github.com/aptos-labs/aptos-ts-sdk
cd aptos-ts-sdk
pnpm install && pnpm build
cd examples/typescript
pnpm install
pnpm jwk_update
```
To set up the JWK owner account in mainnet, you will need to create an account and use it to register the JWKS.
Save the address of the account you used to register the JWKS as you will need it for the next step.
To learn more about the `0x1::jwks::update_federated_jwk_set` entry function, see the [reference documentation](https://aptos.dev/en/build/smart-contracts/move-reference?page=aptos-framework%2Fdoc%2Fjwks.md#0x1_jwks_update_federated_jwk_set).
Caution
**Handling key rotations**
Whenever there is a key rotation of the JWKS, it is important to update the JWKS registered on chain promptly to avoid any loss of access to Federated Keyless accounts. See [here](/build/guides/aptos-keyless/federated-keyless/key-considerations) for more info.
3. Step 3. Follow the Aptos Keyless integration guide
Now that you have registered the JWKS, you can follow the Aptos Keyless integration guide starting from step 2. Be sure to set the `jwkAddress` to the address of the account you used to register the JWKS when deriving the `KeylessAccount`.
[Aptos Keyless Integration Guide - Step 2](/build/guides/aptos-keyless/integration-guide#step-2-install-the-aptos-typescript-sdk)
# Federated Keyless Key Considerations
## Federated Keyless Key Considerations
[Section titled “Federated Keyless Key Considerations”](#federated-keyless-key-considerations)
**Supported IAMs**
Currently, the supported IAMs are Amazon Cognito and Auth0 across devnet, testnet, and mainnet. See a table of the full set of supported IAM providers [here](/build/guides/aptos-keyless/oidc-support).
**Federated Keyless flow**
The flow for Federated Keyless transactions is the same as described [here](/build/guides/aptos-keyless/how-keyless-works). However, the difference is that in Federated Keyless, instead of the OIDC provider (e.g., Google, Apple) acting as the issuer of the JWT, the IAM provider (e.g., Auth0, Cognito) acts as the issuer. The user authenticates with the application, the IAM receives the user’s credentials, and then the IAM issues the Keyless-compatible JWT.
**Available authentication methods**
All authentication methods that are supported by the IAM providers are available for use - this includes SMS OTP, email link, and the traditional username + password.
**Configuration limitations**
A Keyless account address varies according to the `aud` (AKA application ID or client ID) and `iss` (AKA issuer). The setup of your user data within the IAM must reflect the interoperability you seek to provide to your users. JWTs issued for a user in the same user pool but for different applications will result in a different address derivation, since the `aud` value differs.
**JSON Web Key Set management**
If you or the IAM platform rotates the key pairs used to sign the JWTs, the JWK set must be updated on chain using the same account used to instantiate your app’s Federated Keyless accounts. As such, it is vital to:
1. Maintain access to your JWKS owner account
2. Update the JWK set on chain whenever a key rotation occurs
When a keypair is rotated, existing keyless account instantiations will continue to work so long as the old JWK has not been removed. Any new JWTs issued by the new keypair will not be accepted until the JWK set on chain is updated to contain its public key.
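As a sketch (a hypothetical helper, not an SDK function), a dApp could detect this situation by checking whether a fresh JWT’s `kid` header appears among the key IDs registered on chain for its issuer:

```typescript
// Hypothetical helper (not part of the SDK): check whether a JWT's `kid`
// header matches one of the JWK key IDs registered on chain for your issuer.
function jwtKeyIsRegistered(jwt: string, registeredKids: string[]): boolean {
  const headerB64 = jwt.split(".")[0];
  const header = JSON.parse(
    Buffer.from(headerB64.replace(/-/g, "+").replace(/_/g, "/"), "base64").toString("utf8")
  );
  return registeredKids.includes(header.kid);
}

// Fabricated token: header.payload.signature, with payload/signature elided.
const header = Buffer.from(JSON.stringify({ alg: "RS256", kid: "key-2" })).toString("base64url");
const jwt = `${header}.payload.signature`;

console.log(jwtKeyIsRegistered(jwt, ["key-1", "key-2"])); // → true
console.log(jwtKeyIsRegistered(jwt, ["key-1"])); // → false: rotation not yet registered on chain
```

If the check fails for newly issued tokens, the on-chain JWK set is stale and should be updated with `0x1::jwks::update_federated_jwk_set`.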
**The trust and security model for Federated Keyless**
Compared to the existing Keyless implementation, dApp developers using Federated Keyless with certain authentication methods (like email/SMS OTP or email/password) may have more access to user credentials when leveraging IAM providers than with the existing direct OIDC provider integrations.
We recommend each dApp developer perform their own research and consult with their legal counsel before integrating an authentication method. Developers should also understand to what extent they may have access to user credentials and what controls they have in place.
# FAQ
## Federated Keyless FAQs
[Section titled “Federated Keyless FAQs”](#federated-keyless-faqs)
**What if I stop using my IAM for my application? What if I switch IAM providers?**
* An account address depends on values of several variables that are specific to an IAM service, including `aud` (client ID) and `iss` (issuer). If these values are changed, then a different address will be derived.
* If you want to switch IAM providers, you will need to develop an account migration flow, resulting in a key rotation from the account derived from the prior IAM provider to the account derived from the new IAM provider.
* We recommend allowing your users to add a secondary authentication method to their accounts (e.g., back-up private key) so that they can maintain access should the authentication path into their account via Federated Keyless be disrupted via a service provider change. In order to implement this, you need to do a key rotation to a multikey account. For relevant documentation see [key rotation](https://aptos.dev/en/build/guides/key-rotation) and [multikey SDK](https://aptos-labs.github.io/aptos-ts-sdk/@aptos-labs/ts-sdk-1.35.0/classes/MultiKeyAccount.html).
**Does using an IAM cost money?**
* Yes, IAMs usually cost money, but they can help provide useful functionality within your application such as role-based access control (authorization), user management, user authentication, security + compliance, and analytics + monitoring.
**In the case the dApp or IAM provider goes offline, how do I make sure my users can continue accessing their accounts?**
* We recommend allowing your users to add a secondary authentication method to their accounts (e.g., a back-up private key) so that they can maintain access should the authentication path into their account via Federated Keyless be disrupted by a service provider change or other outage.
**I use an open source IAM like Keycloak. Can I use Federated Keyless?**
* Not today. Due to the trust placed in the IAM to have sufficient uptime and security standards, we have limited the accepted IAM set to the currently supported issuers. If you believe your provider should be included for consideration, please consider raising an AIP or contact us in the Keyless developers [telegram](https://t.me/+h5CN-W35yUFiYzkx).
# Federated Keyless Simple Example
The Federated Keyless Example shows how to set up a Federated Keyless account using Auth0 as the IAM provider.
Explore the code in [aptos-keyless-example repository](https://github.com/aptos-labs/aptos-keyless-example/tree/main/examples/federated-keyless-example/).
The Keyless Simple Example is currently undergoing maintenance. Please check back later.
# How Keyless Works
Aptos Keyless enables a dApp to **derive** and **access** a blockchain account for a user who successfully signed in to the dApp via an OIDC provider (e.g., Google). Importantly, this blockchain account is **scoped to the dApp**. This means other dApps, which can similarly sign in the same user via the same OIDC provider, are not able to access this account and instead get their own account.
*But how does this work?*
This article explains the full keyless flow depicted below, from the user first signing in to a dApp, to obtaining a zero-knowledge proof, to finally transacting on-chain.

## Overview
[Section titled “Overview”](#overview)
At a very high level, a successful sign-in into the dApp via the OIDC provider will result in the dApp receiving a **JSON Web Token (JWT)** signed by the OIDC provider. The JWT will contain, among other things, three important pieces of information:
1. The user’s identity (contained in the JWT’s `sub` field)
2. The dApp’s identity (contained in the JWT’s `aud` field)
3. Application-specific data; specifically, an **ephemeral public key (EPK)** (contained in the JWT’s `nonce` field), whose associated **ephemeral secret key (ESK)** only the user knows.
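As a toy illustration (not the SDK’s API; the token below is fabricated and unsigned), these three claims can be read by base64url-decoding the JWT’s payload segment:

```typescript
// Toy illustration: read the `sub`, `aud`, and `nonce` claims from a JWT.
// The token below is fabricated and unsigned, for demonstration only.
function decodeJwtPayload(jwt: string): { sub: string; aud: string; nonce: string } {
  const payloadB64 = jwt.split(".")[1];
  // base64url -> base64, then decode the JSON payload
  const json = Buffer.from(
    payloadB64.replace(/-/g, "+").replace(/_/g, "/"),
    "base64"
  ).toString("utf8");
  return JSON.parse(json);
}

// A JWT has three dot-separated segments: header.payload.signature.
const payload = Buffer.from(
  JSON.stringify({ sub: "uid-123", aud: "dapp-xyz", nonce: "epk-commitment" })
).toString("base64url");
const fakeJwt = `eyJhbGciOiJub25lIn0.${payload}.unsigned`;

const claims = decodeJwtPayload(fakeJwt);
console.log(claims.sub, claims.aud, claims.nonce); // → uid-123 dapp-xyz epk-commitment
```

In a real flow the JWT is signed by the OIDC provider, and its signature is what makes these claims trustworthy.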
Now, assume that the user’s blockchain account address is (more or less) a hash of the user’s identity in `sub` and the dApp’s identity in `aud` from above.
Then, the **key observation** is that the signed JWT effectively acts as a **digital certificate**, **temporarily** binding this blockchain address to the EPK, and allowing the EPK to sign TXNs for it. In other words, it securely delegates TXN signing rights for this blockchain account to the EPK (Note: The EPK contains an expiration date and is thus short-lived).
Importantly, if the user loses their ESK, the user can obtain a new signed JWT over a new EPK via the application by simply signing in again via the OIDC provider (Or, in some cases, by requesting a new signed JWT using an OAuth refresh token).
With this system, the **challenge** is maintaining privacy, since revealing the JWT on-chain would leak the user’s identity. Furthermore, revealing the EPK to the OIDC provider would allow it to track the user’s TXNs on-chain.
We explain below how Keyless accounts work and how they address these challenges.
## Flow: Deriving a keyless account for a user in a dApp
[Section titled “Flow: Deriving a keyless account for a user in a dApp”](#flow-deriving-a-keyless-account-for-a-user-in-a-dapp)
First, let us look at how a dApp can sign in a user via (say) Google, derive that user’s keyless blockchain address, and, for example, send that user an asset.

**Step 1**: The user generates an ephemeral key pair: an EPK with an expiration date, and its associated ESK. The dApp keeps the EPK and safely stores the ESK on the user-side (e.g., in the browser’s local storage, or in a trusted enclave if the ESK is a WebAuthn passkey).
**Step 2**: The dApp commits to the EPK as H(epk,ρ), where ρ is a blinding factor. When the user clicks on the “Sign in with Google” button, the dApp redirects the user to Google’s sign in page and, importantly, sets the `nonce` parameter in the URL to this EPK commitment. This hides the EPK from Google, maintaining privacy of the user’s TXN activity.
**Step 3**: Typically, the user has an HTTP cookie from having previously-signed-in to their Google account, so Google merely checks this cookie. If the user has multiple Google accounts, Google asks the user to select which one they want to sign-in into the dApp (The less common path is for the user to have to type in their Google username and password).
**Step 4**: Once the user has signed in, Google sends the dApp a signed JWT, which includes the user’s `sub` identifier (e.g., `uid-123`), the application’s `aud` identifier (e.g., `"dapp-xyz"`) and the `nonce` with the EPK commitment (This assumes that the dApp has previously registered with Google and received this `"dapp-xyz"` identifier).
**Step 5**: The dApp now has almost everything it needs to derive a keyless account for the user: the user’s identifier (`sub`) and the dApp’s identifier (`aud`). But, to preserve the privacy of the user, the dApp will use a third piece of information: a blinding factor r called a **pepper**. The dApp will contact a so-called **guardian** who will deterministically derive a random r from the given (`sub`, `aud`). Importantly, the guardian will only reveal r to the dApp upon seeing a validly-signed JWT for the queried (`sub`, `aud`).
**Step 6**: The dApp derives the address of the account as addr=H("uid-123","dapp-xyz",r), where H is a cryptographic hash function.
Note that the pepper r is used to hide the user and app identity inside the address since, as we described above, only an authorized user with a valid JWT will be able to obtain this pepper.
Also, note that the address is independent of the EPK. This is why the ESK need not be long-lived and can be lost.
Finally, the dApp can, for example, send an NFT to the user at their address addr.
But how can the dApp authorize TXN from this account at addr? We discuss that next.
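The derivation in Step 6 can be sketched with a toy model (the real scheme uses a SNARK-friendly hash and different encodings; `sha256` and the `|` separator here are purely illustrative):

```typescript
import { createHash } from "crypto";

// Toy model of Step 6: addr = H(sub, aud, pepper).
// The real derivation uses a SNARK-friendly hash and different encodings;
// sha256 and the "|" separator here are purely illustrative.
function deriveToyAddress(sub: string, aud: string, pepper: string): string {
  return "0x" + createHash("sha256").update(`${sub}|${aud}|${pepper}`).digest("hex");
}

const addr = deriveToyAddress("uid-123", "dapp-xyz", "pepper-r");
const again = deriveToyAddress("uid-123", "dapp-xyz", "pepper-r"); // deterministic
const otherDapp = deriveToyAddress("uid-123", "dapp-abc", "pepper-r"); // different `aud`
console.log(addr === again, addr === otherDapp); // → true false
```

Note how the EPK never enters the hash: the same user in the same dApp always lands on the same address, while a different `aud` yields a different one.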
## Flow: Obtaining a zero-knowledge proof before transacting
[Section titled “Flow: Obtaining a zero-knowledge proof before transacting”](#flow-obtaining-a-zero-knowledge-proof-before-transacting)
In the previous flow, we showed how a dApp can sign in a Google user and derive their privacy-preserving keyless address, with the help of a guardian.
Next, we show how this dApp can obtain a zero-knowledge proof (ZKP), which will allow it to authorize transactions from this address for the user. Importantly, the transaction will hide the user’s identifying information (e.g., the `sub` field).

**Step 1**: The dApp sends all the necessary public information (i.e., epk, GPK) and private information (i.e., JWT, signature σ\_G from Google, EPK blinding factor ρ, and pepper r) to the **prover service**.
**Step 2**: The prover derives the user’s address addr and computes a zero-knowledge proof (ZKP) π for the keyless relation R\_keyless (described below). This proof acts as a **privacy-preserving** digital certificate, and binds the user’s address addr to the ephemeral public key epk. The prover then sends π to the dApp.
In order to bind the epk with the user’s address addr, the ZKP will be used to convince the validators that the user is in possession of (1) a JWT signed by Google, (2) which commits to the epk in its `nonce` field, and (3) contains the same information as in the address, without leaking anything about the JWT, its signature σ\_G, ρ, or r.
More formally, the ZKP π convinces a verifier (i.e., the blockchain), who has public inputs (addr,epk,GPK), that the prover knows secret inputs (jwt,σ\_G,ρ,r) such that the relation R\_keyless depicted below holds:
R\_keyless((addr, epk, GPK); (jwt, σ\_G, ρ, r)) holds if and only if: (1) σ\_G is a valid signature on the jwt under Google’s public key GPK, (2) the jwt’s `nonce` field equals H(epk, ρ), and (3) addr = H(jwt.sub, jwt.aud, r).
Recall from before that the signed JWT itself binds the blockchain address addr to epk, so that epk can sign transactions for addr. However, the JWT would leak the user’s identity, so the ZKP serves to hide the JWT (and other private information) while arguing that the proper checks hold (i.e., the checks in R\_keyless).
Next, we show how the dApp can now authorize TXNs from addr.
## Flow: Sending a TXN from a keyless account
[Section titled “Flow: Sending a TXN from a keyless account”](#flow-sending-a-txn-from-a-keyless-account)
The previous flow explained how a dApp can obtain a ZKP from the prover service. Next, we describe how the dApp leverages this ZKP to transact for the account.

**Step 1**: The dApp obtains an ephemeral signature σ\_eph over the TXN from the user. This could be done behind the user’s back, by the dApp itself who might manage the ESK. Or, it could be an actual signing request sent to the user, such as when the ESK is a WebAuthn passkey, which is stored on the user’s trusted hardware.
**Step 2**: The dApp sends the TXN, the ZKP π, the ephemeral public key epk, and the ephemeral signature σ\_eph to the blockchain validators.
**Step 3**: To check the TXN is validly-signed, the validators perform several steps: (1) check that epk has not expired, (2) fetch the user’s address addr from the TXN, (3) verify the ZKP against (addr,epk,GPK), and (4) verify the ephemeral signature σ\_eph on the TXN against the epk. If all these checks pass, they can safely execute the TXN.
## Want more?
[Section titled “Want more?”](#want-more)
The key ideas behind keyless accounts are also explained in this 20 minute presentation below.
[Play](https://youtube.com/watch?v=sKqeGR4BoI0)
# Keyless Integration Guide
Note
**Keyless Account Scoping**
Use of the **Aptos Keyless Integration Guide** will allow for the integration of keyless accounts directly into your application. This means that blockchain accounts are scoped to your application’s domain (logging in with your Google account on dApp A and logging in with your Google account on dApp B will create separate accounts). Stay tuned for more to come on Aptos’ plan to allow Keyless accounts to be used portably across applications.
To provide feedback, get support, or be a design partner as we enhance Aptos Keyless, join us in the Keyless developers [Telegram](https://t.me/+h5CN-W35yUFiYzkx).
At a high level, there are three steps to follow in order to integrate Keyless Accounts.
1. **Configure your OpenID integration with your IdP.** In this step, the dApp will register with the IdP of choice (e.g. Google) and receive a `client_id`
2. **Install the Aptos TypeScript SDK.**
3. **Integrate Keyless Account support in your application client**
1. Set up the `"Sign In with [IdP]"` flow for your user.
2. Instantiate the user’s `KeylessAccount`
3. Sign and submit transactions via the `KeylessAccount`.
## Example Implementation
[Section titled “Example Implementation”](#example-implementation)
You can find an example app demonstrating basic Keyless integration with Google in the [aptos-keyless-example repository](https://github.com/aptos-labs/aptos-keyless-example/). Follow the directions in the README to start with the example. For more detailed instructions on keyless, please read the rest of this integration guide.
1. Step 1. Configure your OpenID integration with your IdP
The first step is to setup the configuration with your IdP(s).
[Follow the instructions here](/build/guides/aptos-keyless/oidc-support)
2. Step 2. Install the Aptos TypeScript SDK
```shellscript
# Keyless is supported in version 1.18.1 and above
pnpm install @aptos-labs/ts-sdk
```
3. Step 3. Client Integration Steps
Below are the default steps for a client to integrate Keyless Accounts
#### 1. Present the user with a “Sign In with \[IdP]” button on the UI
[Section titled “1. Present the user with a “Sign In with \[IdP\]” button on the UI”](#1-present-the-user-with-a-sign-in-with-idp-button-on-the-ui)
1. In the background, we create an ephemeral key pair. Store this in local storage.
```typescript
import {EphemeralKeyPair} from '@aptos-labs/ts-sdk';
const ephemeralKeyPair = EphemeralKeyPair.generate();
```
2. Save the `EphemeralKeyPair` in local storage, keyed by its `nonce`.
```typescript
// This saves the EphemeralKeyPair in local storage
storeEphemeralKeyPair(ephemeralKeyPair);
```
Example implementation for `storeEphemeralKeyPair`
Note
This implementation is an example of how to store the `EphemeralKeyPair` in local storage. Different implementations may be used according to your application’s needs.
```typescript
/**
* Store the ephemeral key pair in localStorage.
*/
export const storeEphemeralKeyPair = (ekp: EphemeralKeyPair): void =>
localStorage.setItem("@aptos/ekp", encodeEphemeralKeyPair(ekp));
/**
* Retrieve the ephemeral key pair from localStorage if it exists.
*/
export const getLocalEphemeralKeyPair = (): EphemeralKeyPair | undefined => {
try {
const encodedEkp = localStorage.getItem("@aptos/ekp");
return encodedEkp ? decodeEphemeralKeyPair(encodedEkp) : undefined;
} catch (error) {
console.warn(
"Failed to decode ephemeral key pair from localStorage",
error
);
return undefined;
}
};
/**
* Stringify the ephemeral key pairs to be stored in localStorage
*/
export const encodeEphemeralKeyPair = (ekp: EphemeralKeyPair): string =>
JSON.stringify(ekp, (_, e) => {
if (typeof e === "bigint") return { __type: "bigint", value: e.toString() };
if (e instanceof Uint8Array)
return { __type: "Uint8Array", value: Array.from(e) };
if (e instanceof EphemeralKeyPair)
return { __type: "EphemeralKeyPair", data: e.bcsToBytes() };
return e;
});
/**
* Parse the ephemeral key pairs from a string
*/
export const decodeEphemeralKeyPair = (encodedEkp: string): EphemeralKeyPair =>
JSON.parse(encodedEkp, (_, e) => {
if (e && e.__type === "bigint") return BigInt(e.value);
if (e && e.__type === "Uint8Array") return new Uint8Array(e.value);
if (e && e.__type === "EphemeralKeyPair")
return EphemeralKeyPair.fromBytes(e.data);
return e;
});
```
3. Prepare the URL params of the login URL. Set the `redirect_uri` and `client_id` to your configured values with the IdP. Set the `nonce` to the nonce of the `EphemeralKeyPair` from step 1.1.
```typescript
const redirectUri = 'https://.../login/callback'
const clientId = env.IDP_CLIENT_ID
// Get the nonce associated with ephemeralKeyPair
const nonce = ephemeralKeyPair.nonce
```
4. Construct the login URL for the user to authenticate with the IdP. Make sure the `openid` scope is set. Other scopes such as `email` and `profile` can be set based on your app’s needs.
```typescript
const loginUrl = `https://accounts.google.com/o/oauth2/v2/auth?response_type=id_token&scope=openid+email+profile&nonce=${nonce}&redirect_uri=${redirectUri}&client_id=${clientId}`
```
5. When the user clicks the login button, redirect the user to the `loginUrl` that was created in step 1.4.
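Taken together, steps 1.3–1.5 can be sketched with `URLSearchParams`, which URL-encodes each value automatically. The `clientId`, `redirectUri`, and `nonce` arguments below are placeholders for illustration:

```typescript
// Sketch: assemble the IdP login URL from steps 1.3-1.5 using URLSearchParams
// so every value is URL-encoded automatically. All argument values are
// placeholders; use your real client_id, redirect_uri, and the nonce from
// your EphemeralKeyPair.
const buildLoginUrl = (
  clientId: string,
  redirectUri: string,
  nonce: string,
): string => {
  const params = new URLSearchParams({
    response_type: "id_token",
    scope: "openid email profile",
    nonce,
    redirect_uri: redirectUri,
    client_id: clientId,
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params.toString()}`;
};
```

Redirecting the user to the result of `buildLoginUrl(...)` is equivalent to redirecting to a hand-concatenated `loginUrl` string.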
#### 2. Handle the callback by parsing the token and create a Keyless account for the user
[Section titled “2. Handle the callback by parsing the token and create a Keyless account for the user”](#2-handle-the-callback-by-parsing-the-token-and-create-a-keyless-account-for-the-user)
1. Once the user completes the login flow, they will be redirected to the `redirect_uri` set in step 1. The JWT will be present in the URL fragment, keyed by `id_token`. Extract the JWT from the `window` as follows:
```typescript
const parseJWTFromURL = (url: string): string | null => {
const urlObject = new URL(url);
const fragment = urlObject.hash.substring(1);
const params = new URLSearchParams(fragment);
return params.get('id_token');
};
// window.location.href = https://.../login/google/callback#id_token=...
const jwt = parseJWTFromURL(window.location.href)
```
2. Decode the JWT and extract the nonce value from the payload.
```typescript
import { jwtDecode } from 'jwt-decode';
const payload = jwtDecode<{ nonce: string }>(jwt);
const jwtNonce = payload.nonce
```
3. Fetch the `EphemeralKeyPair` stored in step 1.2. Make sure to validate the nonce matches the decoded nonce and that the `EphemeralKeyPair` is not expired.
```typescript
const ekp = getLocalEphemeralKeyPair();
// Validate the EphemeralKeyPair
if (!ekp || ekp.nonce !== jwtNonce || ekp.isExpired() ) {
throw new Error("Ephemeral key pair not found or expired");
}
```
4. Instantiate the user’s `KeylessAccount`
Depending on the type of Keyless you are using, follow the instructions below:
1. Normal Keyless
```tsx
import {Aptos, AptosConfig, Network} from '@aptos-labs/ts-sdk';
const aptos = new Aptos(new AptosConfig({ network: Network.DEVNET })); // Configure your network here
const keylessAccount = await aptos.deriveKeylessAccount({
jwt,
ephemeralKeyPair,
});
```
2. Federated Keyless
```tsx
import {Aptos, AptosConfig, Network} from '@aptos-labs/ts-sdk';
const aptos = new Aptos(new AptosConfig({ network: Network.DEVNET })); // Configure your network here
const keylessAccount = await aptos.deriveKeylessAccount({
jwt,
ephemeralKeyPair,
jwkAddress: jwkOwner.accountAddress
});
```
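As a side note on step 2.2: if you would rather not add the `jwt-decode` dependency, the JWT payload can be decoded by hand, since it is just a base64url-encoded JSON segment. This is a minimal sketch that only reads claims and performs no signature verification:

```typescript
// Sketch: decode a JWT payload without a library. A JWT is three dot-separated
// base64url segments; the middle one is the JSON claims payload. This reads
// claims only -- it does NOT verify the token's signature.
const decodeJwtPayload = <T>(jwt: string): T => {
  const base64Url = jwt.split(".")[1];
  // Convert base64url to standard base64 before decoding.
  const base64 = base64Url.replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(Buffer.from(base64, "base64").toString("utf8")) as T;
};
```

In a browser you would use `atob` in place of Node’s `Buffer`.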
#### 3. Store the KeylessAccount in local storage (Optional)
[Section titled “3. Store the KeylessAccount in local storage (Optional)”](#3-store-the-keylessaccount-in-local-storage-optional)
1. After the account has been derived, store the `KeylessAccount` in local storage. This allows the user to return to the application without having to re-authenticate.
```typescript
export const storeKeylessAccount = (account: KeylessAccount): void =>
localStorage.setItem("@aptos/account", encodeKeylessAccount(account));
export const encodeKeylessAccount = (account: KeylessAccount): string =>
JSON.stringify(account, (_, e) => {
if (typeof e === "bigint") return { __type: "bigint", value: e.toString() };
if (e instanceof Uint8Array)
return { __type: "Uint8Array", value: Array.from(e) };
if (e instanceof KeylessAccount)
return { __type: "KeylessAccount", data: e.bcsToBytes() };
return e;
});
```
2. Whenever the user returns back to the application, retrieve the `KeylessAccount` from local storage and use it to sign transactions.
```typescript
export const getLocalKeylessAccount = (): KeylessAccount | undefined => {
try {
const encodedAccount = localStorage.getItem("@aptos/account");
return encodedAccount ? decodeKeylessAccount(encodedAccount) : undefined;
} catch (error) {
console.warn(
"Failed to decode account from localStorage",
error
);
return undefined;
}
};
export const decodeKeylessAccount = (encodedAccount: string): KeylessAccount =>
JSON.parse(encodedAccount, (_, e) => {
if (e && e.__type === "bigint") return BigInt(e.value);
if (e && e.__type === "Uint8Array") return new Uint8Array(e.value);
if (e && e.__type === "KeylessAccount")
return KeylessAccount.fromBytes(e.data);
return e;
});
```
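The JSON replacer/reviver pattern used by both storage helpers can be exercised on its own, independent of the SDK types. This standalone sketch shows how `bigint` and `Uint8Array` values, which `JSON.stringify` cannot serialize natively, round-trip through tagged objects:

```typescript
// Standalone version of the replacer/reviver pattern from the storage helpers:
// bigint and Uint8Array values are tagged with a __type marker on the way out
// and reconstructed on the way back in.
const encodeValue = (value: unknown): string =>
  JSON.stringify(value, (_, e) => {
    if (typeof e === "bigint") return { __type: "bigint", value: e.toString() };
    if (e instanceof Uint8Array)
      return { __type: "Uint8Array", value: Array.from(e) };
    return e;
  });

const decodeValue = <T>(encoded: string): T =>
  JSON.parse(encoded, (_, e) => {
    if (e && e.__type === "bigint") return BigInt(e.value);
    if (e && e.__type === "Uint8Array") return new Uint8Array(e.value);
    return e;
  });
```

The SDK-specific helpers above add one more tag for the account type itself (`bcsToBytes()` / `fromBytes()`); the mechanics are otherwise identical.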
#### 4. Submit transactions to the Aptos blockchain
[Section titled “4. Submit transactions to the Aptos blockchain”](#4-submit-transactions-to-the-aptos-blockchain)
1. Create the transaction you want to submit. Below is a simple coin transfer transaction for example:
```tsx
import {Account} from '@aptos-labs/ts-sdk';
const bob = Account.generate();
const transaction = await aptos.transferCoinTransaction({
sender: keylessAccount.accountAddress,
recipient: bob.accountAddress,
amount: 100,
});
```
2. Sign and submit the transaction to the chain.
```tsx
const committedTxn = await aptos.signAndSubmitTransaction({ signer: keylessAccount, transaction });
```
3. Wait for the transaction to be processed on-chain
```tsx
const committedTransactionResponse = await aptos.waitForTransaction({ transactionHash: committedTxn.hash });
```
# Keyless Introduction
Keyless accounts represent a pivotal advancement within the Aptos ecosystem, revolutionizing the way users onboard and interact with decentralized applications (dApps). Aptos Keyless allows users to gain ownership of a **self-custodial** Aptos blockchain account from their existing OpenID Connect (OIDC) account(s) (e.g., Sign in with Google; Sign in with Apple), rather than from a traditional secret key or mnemonic. In a nutshell, with Aptos Keyless, a user’s blockchain account is their OIDC account. Over time, Keyless will evolve to support many IdPs who support the OIDC standard, but we will begin with support for the providers listed [here](/build/guides/aptos-keyless/oidc-support).
At the core of the keyless accounts paradigm lies a deep understanding of the user experience and security challenges prevalent in traditional blockchain systems. Managing private keys, the cornerstone of user identity and asset ownership, often proves cumbersome and error-prone for users, particularly those lacking technical expertise. Keyless accounts offer an elegant solution by obviating the need for users to grapple with the intricacies of private key management. Instead, users authenticate themselves through common social sign-in options like Google, Apple, and many more. With this new system come some important tradeoffs to understand on behalf of your users before implementing Keyless in your application. The following pages will expand on the benefits of Keyless accounts, how to integrate them, the system architecture, and FAQs. For a more thorough and technical dive into Keyless accounts, please see [AIP-61: Keyless Accounts](https://github.com/aptos-foundation/AIPs/blob/main/aips/aip-61.md).
There are two ways to interact with Keyless accounts in the Aptos ecosystem. Developers are able to either 1) integrate the Aptos Keyless SDK directly into their dApp or 2) integrate a wallet, like Aptos Connect, that supports Keyless account creation. This documentation will focus on case #1 and more details on #2 can be found [here](https://aptosconnect.app/docs/). Please note that a direct integration of the Keyless SDK will result in user accounts being domain specific to your dApp whereas the use of a wallet integration will allow your users to carry their accounts to any application that supports that wallet.
Note: the Aptos Keyless SDK and Aptos Connect are representative examples of the aforementioned product experience, but developers in our ecosystem are building alternatives, like a Keyless Unity SDK and alternative wallet products with Keyless integration.
## Aptos Keyless Benefits
[Section titled “Aptos Keyless Benefits”](#aptos-keyless-benefits)
Keyless accounts are revolutionary to users for the following reasons:
1. **Simplified login user experience**: “1-click” account creation via familiar Web2 logins like Sign In with Google.
2. **Enhanced dApp user experience**: Ability to transact on the Aptos blockchain without needing to navigate away from the application experience to download a wallet.
3. **Secure key management**: Requires no manual secret key management by the user. Users sign transactions with the JSON Web Token (JWT) issued by OIDC providers. As such, blockchain account access is synonymous with access to one’s OIDC account.
4. **Improved account recovery**: Web2-like recovery flows are available to regain access to one’s blockchain account in case the user ever loses access to their OIDC account.
5. **Seamless cross-device experiences**: Users log in with their OIDC account no matter what device they are on - no need to download wallet software on each device, import keys, and encrypt them with a password that must be maintained.
With these benefits come some important structural components of Keyless accounts that developers should be aware of. You can read more on this in our FAQs.
# Keyless OIDC Support
Aptos Keyless supports the following IdPs and IAM providers on our network(s). Support for additional IdPs is coming. Please reach out if you need coverage for a specific use case.
| Identity Provider | Federated Only | Devnet | Testnet | Mainnet |
| ----------------- | -------------- | --------- | ------- | ------- |
| Google | No | Live | Live | Live |
| Apple | No | Live | Live | Live |
| Auth0 | Yes | Live | Live | Live |
| Cognito | Yes | Live | Live | Live |
| Microsoft | No | In review | - | - |
| Github | No | In review | - | - |
| Facebook | No | In review | - | - |
If your identity provider is marked as “Federated Only”, you will need to follow the instructions for [Federated Keyless](/build/guides/aptos-keyless/federated-keyless).
To integrate Aptos Keyless into your dApp, you must register your dApp with at least one of the available identity providers via their OIDC registration process. Each respective registration process will assign a Client ID to your application, which will serve as an identifier for your application in the Keyless architecture.
## Registering your dApp with Google
[Section titled “Registering your dApp with Google”](#registering-your-dapp-with-google)
1. Step 1: Sign in to Google Developer Console
1. Navigate to the [Google Cloud Console](https://console.cloud.google.com/).
2. Sign in with your Google account credentials.
2. Step 2: Create a New Project
1. If you don’t have an existing project, click on the “Select a project” dropdown menu at the top of the page and choose “New Project.”
2. Enter a name for your project and click “Create.” Detailed instructions can be found [here](https://cloud.google.com/resource-manager/docs/creating-managing-projects#creating_a_project).
3. Step 3: Configure Consent Screen
1. In the left sidebar, navigate to “APIs & Services” > “OAuth consent screen.”
2. Choose “External” user type and click “Create.”
3. Enter the required details such as the application name, user support email, and developer contact information.
4. Optionally, add additional details like the application logo and privacy policy URL.
5. Click “Save and continue.” Detailed steps are available [here](https://developers.google.com/workspace/guides/create-credentials#configure_the_oauth_consent_screen).
4. Step 4: Register Your Application
1. In the left sidebar, navigate to “APIs & Services” > “Credentials.” 
2. Click on “Create Credentials” and select “OAuth client ID.” 
3. Choose the application type (e.g., Web application, Desktop app, or Mobile app).
4. Enter the necessary details such as the name of your application and the authorized redirect URIs. For OIDC, the redirect URIs should follow the format .
5. Click “Create.”
5. Step 5: Obtain Client ID and Client Secret
1. After creating the OAuth client ID, Google will provide you with a client ID and client secret. These credentials are essential for authenticating your application.
2. Note down the client ID and client secret securely. Do not expose them publicly.
6. Step 6: Configure OIDC Integration in Your Application
1. Integrate OIDC authentication into your application using a suitable OIDC library or framework (e.g., Passport.js for Node.js, Spring Security for Java, or Auth0 for various platforms).
2. Use the client ID and client secret obtained from Google to configure OIDC authentication in your application settings.
3. Set up the appropriate callback URL () for handling authentication responses from Google.
## Registering your dApp with Apple
[Section titled “Registering your dApp with Apple”](#registering-your-dapp-with-apple)
1. Step 1: Sign in to Apple Developer Account
1. Go to the [Apple Developer website](https://developer.apple.com/).
2. Sign in with your Apple ID.
3. Enroll in the Apple Developer Program if you have not already done so.
2. Step 2: Create a New App ID
1. Navigate to the “Certificates, Identifiers & Profiles” section.
2. Click on “Identifiers” in the sidebar.
3. Click the ”+” button to create a new App ID.
4. Fill in the details for your app, including the name and bundle ID.
5. Enable “Sign in with Apple” under the “Capabilities” section.
6. Click “Continue” and then “Register” to create the App ID.
3. Step 3: Generate a Private Key
1. In the “Keys” section of the “Certificates, Identifiers & Profiles” page, click the ”+” button to create a new key.
2. Enter a name for the key, enable the “Sign in with Apple” capability, and click “Continue.”
3. Download the generated private key and securely store it. This key will be used to authenticate your app with Apple’s OIDC service.
4. Step 4: Configure Redirect URIs
1. Under the “App ID” section, locate your newly created App ID and click on it.
2. Scroll down to the “Sign in with Apple” section and click on “Edit.”
3. Add the redirect URIs that your application will use for callback after authentication. The format should be .
4. Click “Save” to update the settings.
5. Step 5: Set Up Your OIDC Integration
1. Use an OIDC library or framework compatible with Apple’s OIDC service (e.g., Passport.js for Node.js, Spring Security for Java).
2. Configure your application to use the client ID and private key obtained from Apple during the registration process.
3. Set up the appropriate callback URL () for handling authentication responses from Apple.
# Keyless Terminology and FAQ
## Terminology
[Section titled “Terminology”](#terminology)
* **OpenID Connect (OIDC)**: is the identity authentication protocol used to enable federated identity verification. This is the protocol used when a user goes through the “Sign in with Google” flow, for example.
* **Identity Provider (IdP)**: is the trusted authority that authenticates your identity via OIDC. Supported examples include Google.
* **JSON Web Token (JWT):** is an open standard used to share security information between two parties — a client and a server. Each JWT contains encoded JSON objects, including a set of claims. JWTs are signed using a cryptographic algorithm to ensure that the claims cannot be altered after the token is issued.
* `iss`, an identifier for the OIDC provider (e.g., )
* `aud`, the OAuth `client_id` of the application that the user is signing in to (e.g., [Notion.so](https://notion.so))
* `sub`, an identifier that the OIDC provider uses to identify the user
* This could be an identifier specific to this `client_id`
* Or, it could be an identifier shared across different `client_id`’s (e.g., Facebook’s OIDC does this)
* `email`, some providers might also expose the user’s email as one of the fields (e.g., Google)
* in addition, an `email_verified` field will be exposed to indicate if the provider has verified that the user owns this email address
* `nonce`, arbitrary data that the application wants the OIDC provider to sign over
* `iat`, the time the JWT was issued at.
* **Ephemeral Key Pair:** a temporary public/private key pair that is used to sign transactions for an Aptos Keyless account. The public key and its expiration date are committed in the JWT token via the `nonce` field.
* **Keyless Account:** a blockchain account that is directly-derived from (1) a user’s OIDC account (e.g., `alice@gmail.com`) and (2) an associated application’s OAuth client\_id (e.g., Notion.so). Users authenticate through the OIDC flow.
* **JSON Web Key (JWK):** is the cryptographic public key of the OIDC provider. This public key is used to verify the signature on the JWTs that the OIDC provider issues to the client application. This way, the client application can verify the authenticity of the tokens and ensure that they have not been tampered with.
* **client\_id:** the OAuth identifier for your application that you will receive from the IdP after registering your application with them. This will be used in our keyless architecture in the address derivation for your users.
* **redirect\_uri:** the URI of the callback handler once the user successfully authenticates. Needs to be registered with your IdP.
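Putting the JWT claims above together, a decoded ID-token payload might look like the following. All values are illustrative placeholders (the `iss` shown is Google’s issuer; every other value is fabricated for the example):

```typescript
// Illustrative decoded ID-token payload matching the claims described above.
// Every value is a placeholder, not real user data.
const examplePayload = {
  iss: "https://accounts.google.com", // identifier for the OIDC provider
  aud: "1234567890-abc.apps.googleusercontent.com", // your OAuth client_id
  sub: "113990307082899718965", // provider-side user identifier
  email: "alice@gmail.com",
  email_verified: true,
  nonce: "...", // commits to the ephemeral public key and its expiration
  iat: 1700000000, // issued-at timestamp (seconds since epoch)
};
```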
## Ceremony
[Section titled “Ceremony”](#ceremony)
Aptos engaged in iterative trusted setup ceremonies to secure our Groth16 based ZK circuit. A trusted setup ceremony is a multi-party computation (MPC) that outputs the prover and verifier keys used in a zkSNARK system, common for efficient zero-knowledge proof systems. As long as a single participant in the ceremony is honest, the process is considered secure and the outputs will be valid. Our initial ceremony consisted of 140+ members of the Aptos ecosystem, which was an incredible show of the power of decentralization, security, and community - and a follow up ceremony was held following a developer feedback phase that allowed us to identify and implement an improvement to our circuit that helped us ensure Keyless is universally accessible. Our final ceremony contributions can be found in this repo \[here] and verified using the process outlined \[here].
## Frequently Asked Questions
[Section titled “Frequently Asked Questions”](#frequently-asked-questions)
**What is the best way to use Keyless accounts?**
* The best way to use Keyless accounts depends on your use case. If seamless account interoperability across our ecosystem is important to your dApp experience (think: mint an NFT on your platform and allow users to sell their NFT on an external NFT marketplace), you might want to consider integrating a wallet that supports Keyless. If you want to create a fully embedded account experience in your dApp, allowing users to transact without ever leaving your application, you might want to do a direct integration of the Aptos Keyless SDK.
**Does Keyless work with sponsored transactions or do my users always need to pay for their own gas?**
* Yes, Keyless works with sponsored transactions like any regular private key based account.
**If I use the Aptos Keyless SDK, can my users use their accounts across other dApps?**
* Keyless accounts are scoped to the domain they are created with as the address derivation includes a unique identifier for the application.
**What is Aptos Connect?**
* [Aptos Connect](https://aptosconnect.app/docs/) is a wallet product that supports Keyless account creation, allowing users to carry their accounts to any application that integrates that wallet.
**What tradeoffs should developers be aware of?**
* Account management infrastructure: central to the keyless accounts paradigm is a robust account management infrastructure that facilitates the creation, deletion, and management of user accounts, alongside the storage and retrieval of associated metadata.
* While the adoption of keyless accounts heralds a paradigm shift towards enhanced usability and security, it is imperative for developers to remain cognizant of the tradeoffs associated with this system vs. common alternatives like plaintext private keys.
**Is there a dependency on external services?**
* Yes, Keyless accounts introduce a degree of dependency on external authentication services (the pepper and prover services), necessitating contingency plans and fallback mechanisms to mitigate service disruptions and ensure uninterrupted user access.
**If my dApp goes down, my users cannot access their Keyless accounts. How can I help protect them in that case?**
* We encourage dApp developers to support additional backup recovery options for your users when integrating Keyless into a dApp. Specifically, we recommend that you support adding a backup private key to Keyless accounts in your dApp. Practically, this would transform the accounts into 1 of 2 multi-signature accounts where both keys are owned by the user. This would allow users to continue using OIDC login via your dApp to access their Keyless accounts but would add the ability for your users to export their backup private key to any self custodial product, where they could sign transactions from that same account with their traditional private key. Doing this will ensure that users never lose access to their digital assets, even if your dApp shuts down or the user loses access to their OIDC account.
* You should make a determination at what point in the user journey to incorporate a back-up is appropriate for your dApp. Incorporating a backup method later in the user journey would preserve the seamless onboarding experience that Keyless offers but could result in less users receiving a recovery key. Prompting users to add a backup key during the onboarding process would likely lead to more users receiving a recovery key but could add potential friction during the onboarding process.
# Keyless Simple Example
Explore the code in [aptos-keyless-example repository](https://github.com/aptos-labs/aptos-keyless-example/tree/main/examples/keyless-example/).
The Keyless Simple Example is currently undergoing maintenance. Please check back later.
This is a live Keyless example on StackBlitz. Follow the instructions in the `README.md` to add your own Google `client_id`.
[aptos-keyless-example](https://stackblitz.com/edit/vitejs-vite-3fuvtu?embed=1\&file=README.md)
# Build an End-to-End Dapp on Aptos
A common way to learn a new framework or programming language is to build a simple todo list. In this tutorial, we will learn how to build an end-to-end todo list dapp, starting from the smart contract side through the front-end side and finally use of a wallet to interact with the two.
See the completed code in the [source-code](https://github.com/aptos-labs/developer-docs/tree/main/apps/nextra/pages/en/build/guides/build-e2e-dapp).
## Chapters
[Section titled “Chapters”](#chapters)
After meeting the [prerequisites](#prerequisites) and [getting set up](#setup) as described below, you will follow this tutorial in this order:
1. [Create a smart contract](/build/guides/build-e2e-dapp/1-create-smart-contract)
2. [Set up a frontend](/build/guides/build-e2e-dapp/2-set-up-the-frontend)
3. [Fetch Data from Chain](/build/guides/build-e2e-dapp/3-fetch-data-from-chain)
4. [Submit data to chain](/build/guides/build-e2e-dapp/4-submit-data-to-chain)
5. [Handle Tasks](/build/guides/build-e2e-dapp/5-handle-tasks)
## Prerequisites
[Section titled “Prerequisites”](#prerequisites)
You must have:
* [node and npm](https://nodejs.org/en/)
Although we will explain some React decisions, we will not dive deep into how React works, so we assume you have some prior experience with React.
## Setup
[Section titled “Setup”](#setup)
In this section, we will create a `my-first-dapp` directory to hold our project files, both client-side code (React based) and the Move code (our smart contract).
For that, we will be using [create-aptos-dapp](/build/create-aptos-dapp) to create the project.
1. Open a terminal and navigate to the desired directory for the project (for example, the `Desktop` directory).
2. Run `npx create-aptos-dapp@latest` to create the project.
```shellscript
npx create-aptos-dapp@latest
```
3. Follow the instructions to create the project.
* Choose a name for the project, for example `my-first-dapp`.
* Choose the `Full-stack project` option.
* Choose the `Boilerplate Template` option.
* For simplicity, choose not to use Surf.
* Choose the `Vite app` framework option.
* Choose the `Devnet` network option.
The tool will create the project in a directory with the same name as the project and install the required dependencies.
Follow the `Next Steps` instructions.
Now let’s [create a smart contract](/build/guides/build-e2e-dapp/1-create-smart-contract).
# 1. Create a Smart Contract
This is the first chapter of the tutorial on [building an end-to-end dapp on Aptos](/build/guides/build-e2e-dapp). If you haven’t done it, review that introduction, and ensure your environment meets the [prerequisites](/build/guides/build-e2e-dapp#prerequisites) listed there.
Now that you are all set up, let’s explore the `contract` directory.

### What is a `Move.toml` file?
[Section titled “What is a Move.toml file?”](#what-is-a-movetoml-file)
A `Move.toml` file is a manifest file that contains metadata such as name, version, and dependencies for the package.
Take a look at the new `Move.toml` file. You should see your package information and an `AptosFramework` dependency. The `AptosFramework` dependency points to the `aptos-core/aptos-move/framework/aptos-framework` GitHub repo main branch.
### Why `sources` directory?
[Section titled “Why sources directory?”](#why-sources-directory)
The `sources` directory holds a collection of `.move` module files. Later, when we compile the package using the CLI, the compiler will look for the `sources` directory and its `Move.toml` file.
### What is the `tests` directory?
[Section titled “What is the tests directory?”](#what-is-the-tests-directory)
The `tests` directory holds `.move` files that are used to test the files in our `sources` directory.
### Create a Move module
[Section titled “Create a Move module”](#create-a-move-module)
An account is needed to publish a Move module. When we installed the template, the tool created a new account for us and added it to the `.env` file. If you open that file, you will see content resembling:
```shellscript
PROJECT_NAME=my-aptos-dapp
VITE_APP_NETWORK=devnet
VITE_APTOS_API_KEY=""
VITE_MODULE_PUBLISHER_ACCOUNT_ADDRESS=0x1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb
#This is the module publisher account's private key. Be cautious about who you share it with, and ensure it is not exposed when deploying your dApp.
VITE_MODULE_PUBLISHER_ACCOUNT_PRIVATE_KEY=0x84638fd5c42d0937503111a587307169842f355ab661b5253c01cfe389373f43
```
Note
You just created a new account on the Aptos (dev) network! Yay! You can see it by going to the Aptos Explorer Devnet network view, pasting the `VITE_MODULE_PUBLISHER_ACCOUNT_ADDRESS` value from your `.env` file into the search field, and clicking on the dropdown option!
The Boilerplate template comes with a pre-generated `message_board.move` file, a relevant test file, and a `Move.toml` file.
As mentioned, our sources directory holds our `.move` module files; so let’s create a new `todolist.move` file.
1. Create a new `todolist.move` file within the `sources` directory and add the following to that file:
```move
module todolist_addr::todolist {
}
```
2. Open the `Move.toml` file.
3. Add the following code to that Move file:
```toml
[addresses]
todolist_addr='_'
```
Note
A Move module is stored under an address (so when it is published, anyone can access it using that address); the syntax for a Move module is
```move
module <account-address>::<module-name> {
}
```
In our module, the `account-address` is `todolist_addr` (a variable we just declared on the `Move.toml` file in the previous step that holds an `address`), and the `module-name` is `todolist` (a random name we selected).
### What is the `'_'` in the `Move.toml` file?
[Section titled “What is the '\_' in the Move.toml file?”](#what-is-the-_-in-the-movetoml-file)
The `'_'` is a placeholder for the account address. When we run the `move` compiler, the compiler will replace it with the actual account address.
`create-aptos-dapp` comes with premade scripts to easily run `move` commands, like `compile`, `test` and `publish`.
1. Open each of the files in the `scripts/move` directory and update the `message_board_addr` variable to be `todolist_addr`.
```js
...
namedAddresses: {
todolist_addr: process.env.VITE_MODULE_PUBLISHER_ACCOUNT_ADDRESS,
},
...
```
Note
Later, when we run each of the `move` commands, these scripts will run and replace the `'_'` with the actual account address assigned to the `todolist_addr` variable.
### Our contract logic
[Section titled “Our contract logic”](#our-contract-logic)
Before jumping into writing code, let’s first understand what we want our smart contract program to do. For ease of understanding, we will keep the logic pretty simple:
1. An account creates a new list.
2. An account creates a new task on their list.
* Whenever someone creates a new task, emit a `task_created` event.
3. Let an account mark their task as completed.
Note
Creating an event is not mandatory, but it is useful if dapps/users want to monitor data, such as how many people create a new task, using the [Aptos Indexer](/build/indexer).
We can start with defining a `TodoList` struct, that holds the:
* tasks array
* new task event
* a task counter that counts the number of created tasks (we can use that to differentiate between the tasks)
And also create a `Task` struct that holds:
* the task ID - derived from the TodoList task counter.
* address - the account address who created that task.
* content - the task content.
* completed - a boolean that marks whether that task is completed or not.
On the `todolist.move` file, update the content in the module with:
```move
...
struct TodoList has key {
    tasks: Table<u64, Task>,
    set_task_event: event::EventHandle<Task>,
    task_counter: u64
}

struct Task has store, drop, copy {
    task_id: u64,
    address: address,
    content: String,
    completed: bool,
}
...
```
**What did we just add?**
**TodoList**
A struct that has the `key` ability:
* The `key` ability allows the struct to be used as a storage identifier. In other words, `key` lets a value be stored at the top level of an account and act as storage. We need it here so `TodoList` can be a resource stored in our user account.
When a struct has the `key` ability, it becomes a `resource`:
* A `resource` is stored under an account; therefore, it *exists* only when assigned to an account and can be *accessed* only through that account.
**Task**
A struct that has the `store`, `drop` and `copy` abilities.
* `store` - `Task` needs `store` as it is stored inside another struct (`TodoList`).
* `copy` - the value can be *copied* (or cloned by value).
* `drop` - the value can be *dropped* by the end of scope.
Let’s try to compile what we have now:
1. Run: `npm run move:compile`
**Seeing errors?!** Let’s understand them.
We have some errors on `Unbound type`- this is happening because we used some types but never imported them, and the compiler doesn’t know where to get them from.
2. At the top of the module, import those types by adding:
```move
...
use aptos_framework::event;
use std::string::String;
use aptos_std::table::Table;
...
```
That will tell the compiler where it can get those types from.
3. Run the `npm run move:compile` command again. If all goes well, we should see a response resembling the following (where the resulting account address is your default profile account address):
```shellscript
Compiling, may take a little while to download git dependencies...
UPDATING GIT DEPENDENCY https://github.com/aptos-labs/aptos-core.git
INCLUDING DEPENDENCY AptosFramework
INCLUDING DEPENDENCY AptosStdlib
INCLUDING DEPENDENCY MoveStdlib
BUILDING MessageBoard
{
"Result": [
"1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb::message_board",
"1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb::todolist"
]
}
```
At this point, we have successfully compiled our Move module. Yay!
We can also delete the `message_board.move` file, as we won’t be using it. And remove the `message_board_addr` from the `Move.toml` file.
4. Let’s make sure everything is still working by running the `npm run move:compile` command again.
We also have a new `move/build` directory (created by the compiler) that holds our compiled modules, build information and `sources` directory.
### Create list function
[Section titled “Create list function”](#create-list-function)
The first thing an account can and should do with our contract is to create a new list.
Creating a list is essentially submitting a transaction, and so we need to know the `signer` who signed and submitted the transaction:
1. Add a `create_list` function that accepts a `signer`
```move
public entry fun create_list(account: &signer){
}
```
**Let’s understand the components of this function**
* `entry` - an *entry* function is a function that can be called via transactions. Simply put, whenever you want to submit a transaction to the chain, you should call an entry function.
* `&signer` - The **signer** argument is injected by the Move VM as the address who signed that transaction.
Our code has a `TodoList` resource. Resource is stored under the account; therefore, it *exists* only when assigned to an account and can be *accessed* only through this account.
That means to create the `TodoList` resource, we need to assign it to an account that only this account can have access to.
The `create_list` function can handle that `TodoList` resource creation.
2. Add the following to the `create_list` function
```move
public entry fun create_list(account: &signer){
    let tasks_holder = TodoList {
        tasks: table::new(),
        set_task_event: account::new_event_handle<Task>(account),
        task_counter: 0
    };
    // move the TodoList resource under the signer account
    move_to(account, tasks_holder);
}
```
This function takes in a `signer`, creates a new `TodoList` resource, and uses `move_to` to have the resource stored in the provided signer account.
### Create task function
[Section titled “Create task function”](#create-task-function)
As mentioned before, our contract has a create task function that lets an account create a new task. Creating a task is also essentially submitting a transaction, and so we need to know the `signer` who signed and submitted the transaction. Another element we want to accept in our function is the task `content`.
1. Add a `create_task` function that accepts a `signer` and task `content` and the function logic.
```move
public entry fun create_task(account: &signer, content: String) acquires TodoList {
    // gets the signer address
    let signer_address = signer::address_of(account);
    // gets the TodoList resource
    let todo_list = borrow_global_mut<TodoList>(signer_address);
    // increment task counter
    let counter = todo_list.task_counter + 1;
    // creates a new Task
    let new_task = Task {
        task_id: counter,
        address: signer_address,
        content,
        completed: false
    };
    // adds the new task into the tasks table
    table::upsert(&mut todo_list.tasks, counter, new_task);
    // sets the task counter to be the incremented counter
    todo_list.task_counter = counter;
    // fires a new task created event
    event::emit_event<Task>(
        &mut borrow_global_mut<TodoList>(signer_address).set_task_event,
        new_task,
    );
}
```
2. Since we now use two new modules, `signer` and `table` (you can see them being used via `signer::` and `table::`), we need to import them. At the top of the file, add these two `use` statements:
```move
use std::signer;
use aptos_std::table::{Self, Table}; // This one we already have, need to modify it
```
**Back to the code; what is happening here?**
* First, we want to get the signer address, so we can get this account’s `TodoList` resource.
* Then, we retrieve the `TodoList` resource with the `signer_address`; with that we have access to the `TodoList` properties.
* We can now increment the `task_counter` property, and create a new `Task` with the `signer_address`, `counter` and the provided `content`.
* We push it to the `todo_list.tasks` table that holds all of our tasks along with the new `counter` (which is the table key) and the newly created Task.
* Then we assign the global `task_counter` to be the new incremented counter.
* Finally, we emit the `task_created` event that holds the new `Task` data. `emit_event` is an `aptos-framework` function that accepts a reference to the event handle and a message. In our case, we pass a reference (using the `&` sign) to the `set_task_event` property of the account’s `TodoList` resource as the first argument, and the newly created `Task` as the second, message, argument. Remember, we have a `set_task_event` property in our `TodoList` struct.
### Complete task function
[Section titled “Complete task function”](#complete-task-function)
Another function we want our contract to hold is the option to mark a task as completed.
1. Add a `complete_task` function that accepts a `signer` and a `task_id`:
```move
public entry fun complete_task(account: &signer, task_id: u64) acquires TodoList {
    // gets the signer address
    let signer_address = signer::address_of(account);
    // gets the TodoList resource
    let todo_list = borrow_global_mut<TodoList>(signer_address);
    // gets the task that matches the task_id
    let task_record = table::borrow_mut(&mut todo_list.tasks, task_id);
    // update task as completed
    task_record.completed = true;
}
```
**Let’s understand the code.**
* As before in our create list function, we retrieve the `TodoList` struct by the signer address, so we can have access to the tasks table that holds all the account tasks.
* Then, we look for the task with the provided `task_id` on the `todo_list.tasks` table.
* Finally, we update that task completed property to be true.
Now try to compile the code:
2. Run: `npm run move:compile`
3. Another `Unbound` error? To fix this, add a `use` statement to use the `account` module.
```move
use aptos_framework::account;
```
4. Run `npm run move:compile` again.
### Add validations
[Section titled “Add validations”](#add-validations)
As this code now compiles, we want to have some validations and checks before creating a new task or updating the task as completed, so we can be sure our functions work as expected.
1. Add a check to the `create_task` function to make sure the signer account has a list:
```move
public entry fun create_task(account: &signer, content: String) acquires TodoList {
    // gets the signer address
    let signer_address = signer::address_of(account);
    // assert signer has created a list
    assert!(exists<TodoList>(signer_address), 1);
    ...
}
```
2. Add a check to the `complete_task` function to make sure the:
* signer has created a list.
* task exists.
* task is not completed.
With:
```move
public entry fun complete_task(account: &signer, task_id: u64) acquires TodoList {
    // gets the signer address
    let signer_address = signer::address_of(account);
    // assert signer has created a list
    assert!(exists<TodoList>(signer_address), 1);
    // gets the TodoList resource
    let todo_list = borrow_global_mut<TodoList>(signer_address);
    // assert task exists
    assert!(table::contains(&todo_list.tasks, task_id), 2);
    // gets the task that matches the task_id
    let task_record = table::borrow_mut(&mut todo_list.tasks, task_id);
    // assert task is not completed
    assert!(task_record.completed == false, 3);
    // update task as completed
    task_record.completed = true;
}
```
We just added our first `assert` statements!
If you noticed, `assert` accepts two arguments: the first is what to check for, and the second is an error code. Instead of passing in an arbitrary number, a convention is to declare `errors` on the top of the module file and use these instead.
On the top of the module file (under the `use` statements), add those error declarations:
```move
// Errors
const ENOT_INITIALIZED: u64 = 1;
const ETASK_DOESNT_EXIST: u64 = 2;
const ETASK_IS_COMPLETED: u64 = 3;
```
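On the frontend side, numeric abort codes like these can later be mapped back to human-readable messages. Here is a minimal sketch; the mapping and helper name are illustrative, not part of the template:

```typescript
// Illustrative frontend helper: map the module's abort codes
// (mirroring the Move error constants above) to readable messages.
const ABORT_MESSAGES: Record<number, string> = {
  1: "Todo list not initialized for this account", // ENOT_INITIALIZED
  2: "Task does not exist",                        // ETASK_DOESNT_EXIST
  3: "Task is already completed",                  // ETASK_IS_COMPLETED
};

function abortMessage(code: number): string {
  return ABORT_MESSAGES[code] ?? `Unknown abort code ${code}`;
}
```

Keeping the codes in one place like this mirrors the convention of declaring error constants at the top of the Move module.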
Now we can update our asserts with these constants:
```move
public entry fun create_task(account: &signer, content: String) acquires TodoList {
    // gets the signer address
    let signer_address = signer::address_of(account);
    // assert signer has created a list
    assert!(exists<TodoList>(signer_address), ENOT_INITIALIZED);
    ...
}

public entry fun complete_task(account: &signer, task_id: u64) acquires TodoList {
    // gets the signer address
    let signer_address = signer::address_of(account);
    // assert signer has created a list
    assert!(exists<TodoList>(signer_address), ENOT_INITIALIZED);
    // gets the TodoList resource
    let todo_list = borrow_global_mut<TodoList>(signer_address);
    // assert task exists
    assert!(table::contains(&todo_list.tasks, task_id), ETASK_DOESNT_EXIST);
    // gets the task that matches the task_id
    let task_record = table::borrow_mut(&mut todo_list.tasks, task_id);
    // assert task is not completed
    assert!(task_record.completed == false, ETASK_IS_COMPLETED);
    // update task as completed
    task_record.completed = true;
}
```
**WONDERFUL!!**
Let’s stop for one moment and make sure our code compiles by running the `npm run move:compile` command. If all goes well, we should see output resembling:
```shellscript
Compiling, may take a little while to download git dependencies...
UPDATING GIT DEPENDENCY https://github.com/aptos-labs/aptos-core.git
INCLUDING DEPENDENCY AptosFramework
INCLUDING DEPENDENCY AptosStdlib
INCLUDING DEPENDENCY MoveStdlib
BUILDING MessageBoard
{
"Result": [
"1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb::todolist"
]
}
```
If you encounter errors, make sure you followed the steps above correctly and try to determine the cause of the issues.
### Write tests
[Section titled “Write tests”](#write-tests)
Now that we have our smart contract logic ready, we need to add some tests for it.
First, delete the `test_end_to_end.move` file in the `tests` directory, as we won’t be using it.
1. For simplicity, and because we don’t have much code to test, we use one function to test the whole flow of the app and will have it in the `todolist.move` file. The test steps are:
```move
// create a list
// create a task
// update task as completed
```
2. Add the following code to the bottom of the `todolist.move` file:
```move
#[test]
public entry fun test_flow() {
}
```
Note: Test functions use the `#[test]` annotation.
Note
We need to use `entry` here because we are testing an `entry` function.
3. Update the test function to be:
```move
#[test(admin = @0x123)]
public entry fun test_flow(admin: signer) acquires TodoList {
    // creates an admin @todolist_addr account for test
    account::create_account_for_test(signer::address_of(&admin));
    // initialize contract with admin account
    create_list(&admin);
    // creates a task by the admin account
    create_task(&admin, string::utf8(b"New Task"));
    let task_count = event::counter(&borrow_global<TodoList>(signer::address_of(&admin)).set_task_event);
    assert!(task_count == 1, 4);
    let todo_list = borrow_global<TodoList>(signer::address_of(&admin));
    assert!(todo_list.task_counter == 1, 5);
    let task_record = table::borrow(&todo_list.tasks, todo_list.task_counter);
    assert!(task_record.task_id == 1, 6);
    assert!(task_record.completed == false, 7);
    assert!(task_record.content == string::utf8(b"New Task"), 8);
    assert!(task_record.address == signer::address_of(&admin), 9);
    // updates task as completed
    complete_task(&admin, 1);
    let todo_list = borrow_global<TodoList>(signer::address_of(&admin));
    let task_record = table::borrow(&todo_list.tasks, 1);
    assert!(task_record.task_id == 1, 10);
    assert!(task_record.completed == true, 11);
    assert!(task_record.content == string::utf8(b"New Task"), 12);
    assert!(task_record.address == signer::address_of(&admin), 13);
}
```
Our `#[test]` annotation has changed and declares an account variable.
Additionally, the function itself now accepts a signer argument.
**Let’s understand our tests.**
Since our tests run outside an account scope, we need to *create* accounts to use in our tests. The `#[test]` annotation gives us the option to declare those accounts. We use an `admin` account and set it to a random account address (`@0x123`). The function accepts this signer (account) and creates it by using a built-in function to create an account for test.
Then we simply go through the flow by:
* creating a list
* creating a task
* updating a task as completed
And assert the expected data/behavior at each step.
Before running the tests again, we need to import (`use`) some new modules we are now employing in our code:
4. At the top of the file, add this `use` statement:
```move
use std::string::{Self, String}; // already have it, need to modify
```
5. Run the `npm run move:test` command. If all goes right, we should see a success message like:
```move
Running Move unit tests
[ PASS ] 0x1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb::todolist::test_flow
Test result: OK. Total tests: 1; passed: 1; failed: 0
{
"Result": "Success"
}
```
6. Let’s add one more test to make sure our `complete_task` function works as expected. Add another test function with:
```move
#[test(admin = @0x123)]
#[expected_failure(abort_code = ENOT_INITIALIZED)]
public entry fun account_can_not_update_task(admin: signer) acquires TodoList {
// creates an admin @todolist_addr account for test
account::create_account_for_test(signer::address_of(&admin));
// account can not toggle task as no list was created
complete_task(&admin, 2);
}
```
This test confirms that an account can’t use that function if they haven’t created a list before.
The test also uses a special annotation `#[expected_failure]` that, as the name suggests, expects to fail with an `ENOT_INITIALIZED` error code.
7. Run the `npm run move:test` command again. If all goes right, we should see a success message like:
```shellscript
Running Move unit tests
[ PASS ] 0x1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb::todolist::account_can_not_update_task
[ PASS ] 0x1cecfef9e239eff12fb1a3d189a121c37f48908d86c0e9c02ec103e0a05ddebb::todolist::test_flow
Test result: OK. Total tests: 2; passed: 2; failed: 0
{
"Result": "Success"
}
```
Now that everything works, we can compile the Move modules and publish the Move package to chain so our React app (and everyone else) can interact with our smart contract!
### Publish todolist module to chain
[Section titled “Publish todolist module to chain”](#publish-todolist-module-to-chain)
1. Run: `npm run move:compile`
We are getting some *Unused alias* errors. This is because we added the `string` alias earlier, since we use it in our tests, but we don’t use it in our smart contract code.
That is why we get the error when compiling the module but not when only running tests.
To fix it, we can add a `use` statement that would be used only in tests.
Add the following `use` statement where we have all of our import statements.
```move
use std::string::String; // change to this
...
#[test_only]
use std::string; // add this
```
2. Run: `npm run move:test` and `npm run move:compile` - all should work without errors.
3. Run: `npm run move:publish`
4. Enter `yes` in the prompt.
**Oh no! We got an error!**
It complains about an account mismatch: apparently we compiled the package with a different account than the one we are trying to publish it with.
Let’s fix it.
1. Open the `scripts/move/publish.js` file.
2. Update the `addressName` variable value to be `todolist_addr`.
That will use the same account we used for compiling the package.
Let’s try again:
1. Run: `npm run move:publish`
2. Enter `yes` in the prompt.
3. Enter `yes` in the second prompt.
4. That will compile, simulate and finally publish your module into devnet. You should see a success message:
```shellscript
Transaction submitted: https://explorer.aptoslabs.com/txn/0x68dadf24b9ec29b9c32bd78836d20032de615bbef5f10db580228577f7ca945a?network=devnet
Code was successfully deployed to object address 0x2bce4f7bb8a67641875ba5076850d2154eb9621b0c021982bdcd80731279efa6
{
"Result": "Success"
}
```
5. You can now head to the [Aptos Explorer](https://explorer.aptoslabs.com/) link and view the transaction details. You can also see the module published on chain by looking for the object address.
Note
Check out your `.env` file and see the `VITE_MODULE_ADDRESS` variable; it is set to the object address of the published module.
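As a sketch of how that variable typically reaches the app code: Vite exposes variables prefixed with `VITE_` on `import.meta.env`. The helper below is illustrative (it takes an env-like object so the lookup logic is easy to see in isolation):

```typescript
// Illustrative: resolve the published module address from an env-like object.
// In the real app this would read from import.meta.env; the "0x0" fallback
// is a placeholder for illustration only.
function moduleAddressFrom(env: Record<string, string | undefined>): string {
  return env.VITE_MODULE_ADDRESS ?? "0x0";
}
```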
Now let’s [set up the frontend](/build/guides/build-e2e-dapp/2-set-up-the-frontend) in chapter 2.
# 2. Set up the frontend
This is the second chapter of the tutorial on [building an end-to-end dapp on Aptos](/build/guides/build-e2e-dapp) where you have already [created a smart contract](/build/guides/build-e2e-dapp/1-create-smart-contract) and are now setting up the frontend.
## Set up the frontend
[Section titled “Set up the frontend”](#set-up-the-frontend)
`create-aptos-dapp` has already created the frontend for us with a basic layout and Wallet implementation using the `aptos-wallet-adapter` library.
1. Run: `npm run dev`
At this point you should have your app running locally, displaying the default template layout.
2. In the `frontend` directory, find all the frontend files. Let’s clean it up a bit.
3. Open the `App.tsx` file and update its content to be:
```typescript
import { Header } from "@/components/Header";
import { TopBanner } from "@/components/TopBanner";

function App() {
  return (
    <>
      <TopBanner />
      <Header />
      My app goes here
    </>
  );
}

export default App;
```
Once you save the changes, you should see that the app content has changed in the browser and displays `My app goes here`.
## Our dapp UI
[Section titled “Our dapp UI”](#our-dapp-ui)
First we will build the dapp UI layout. We have two UI states for the app:
* When an account hasn’t created a list yet (on the left).
* When an account has created a list and can now add tasks to it (on the right). 
We now have a working client with a Wallet connect button and a wallet selector modal. Feel free to play with it and connect a wallet with it.
Then learn how to [fetch data from chain](/build/guides/build-e2e-dapp/3-fetch-data-from-chain) in chapter 3.
# 3. Fetch Data from Chain
In the third chapter of the tutorial on [building an end-to-end dapp on Aptos](/build/guides/build-e2e-dapp), you will be learning to fetch data from chain.
Our UI logic relies on whether the connected account has created a todo list. If the account has created a todo list, our app should display that list; if not, the app should display a button offering the option to create a new list.
For that, we first need to check if the connected account has a `TodoList` resource. In our smart contract, whenever someone creates a todo list we create and assign a `TodoList` resource to their account.
To fetch data from chain, we can use the [Aptos TypeScript SDK](/build/sdks/ts-sdk). The SDK provides classes and functions for us to easily interact and query the Aptos chain.
To get started:
1. Stop the local server if running.
2. Import wallet from the wallet adapter React provider:
```tsx
import { useWallet } from "@aptos-labs/wallet-adapter-react";
```
3. Extract the account object from the wallet adapter:
```tsx
function App() {
  const { account } = useWallet();
  ...
}
```
The `account` object is `null` if there is no account connected; when an account is connected, the `account` object holds the account information, including the account address.
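A minimal sketch of guarding on that nullable account; the `Account` shape here is a simplified stand-in for the wallet adapter’s account object:

```typescript
// Simplified stand-in for the wallet adapter's account object.
type Account = { address: string } | null;

// Render-friendly label: handles the disconnected (null) case explicitly.
function accountLabel(account: Account): string {
  if (!account) return "not connected";
  return account.address;
}
```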
4. Next, we want to fetch the account’s `TodoList` resource. Begin by importing `useEffect` with `import { useEffect } from "react";`, then add a `useEffect` hook that calls a function to fetch the resource whenever our account address changes:
```tsx
function App() {
  ...
  useEffect(() => {
    fetchList();
  }, [account?.address]);
  ...
}
```
5. Before creating our `fetchList` function, let’s also create a local state to store whether the account has a list:
```tsx
function App() {
  ...
  const [accountHasList, setAccountHasList] = useState(false);
  ...
}
```
Also import `useState` using `import { useState, useEffect } from "react";`
6. Import the `MODULE_ADDRESS` variable using `import { MODULE_ADDRESS } from "./constants";`. This is the address of the module we published in the previous chapter.
7. Import `aptosClient` using `import { aptosClient } from "./utils/aptosClient";`. This is a client `create-aptos-dapp` created for us to interact with the chain.
8. Our `useEffect` hook calls a `fetchList` function; let’s create it:
```tsx
const fetchList = async () => {
if (!account) return [];
const moduleAddress = MODULE_ADDRESS;
try {
const todoListResource = await aptosClient().getAccountResource(
{
accountAddress:account?.address,
resourceType:`${moduleAddress}::todolist::TodoList`
}
);
setAccountHasList(true);
} catch (e: any) {
setAccountHasList(false);
}
};
```
The `aptosClient().getAccountResource()` expects an *account address* that holds the resource we are looking for and a string representation of an on-chain *Move struct type*.
* account address - is the current connected account (we are getting it from the wallet account object)
* Move struct type string syntax:
* The account address who holds the move module
* The module name the resource lives in = `todolist`
* The resource name = `TodoList`
If the request succeeds and there is a resource for that account, we want to set our local state to `true`; otherwise, we would set it to `false`.
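The three parts of that Move struct type string can be sketched as a tiny helper; the helper itself is illustrative, not part of the SDK:

```typescript
// Illustrative: build the fully qualified Move struct tag
// <module address>::<module name>::<struct name>
// expected by getAccountResource's resourceType parameter.
function structTag(moduleAddress: string, moduleName: string, structName: string): string {
  return `${moduleAddress}::${moduleName}::${structName}`;
}
```

For our resource this produces a string like `<your module address>::todolist::TodoList`.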
9. Let’s update our UI based on the `accountHasList` state:
```tsx
return (
  <>
    {!accountHasList && (
      ...
    )}
  </>
);
```
We now have an **Add new list** button that appears only if the account doesn’t have a list.
Start the local server with `npm run dev`. You should see the **Add new list** button.
Next, let’s understand how to create a new list by [submitting data to chain](/build/guides/build-e2e-dapp/4-submit-data-to-chain) in chapter 4.
# 4. Submit Data to Chain
In the fourth chapter of the tutorial on [building an end-to-end dapp on Aptos](/build/guides/build-e2e-dapp), you will be submitting data to the chain.
So now we have an **Add new list** button that appears if the connected account hasn’t created a list yet. We still don’t have a way for an account to create a list, so let’s add that functionality.
1. First, our wallet adapter provider has a `signAndSubmitTransaction` function; let’s extract it by updating the following:
```tsx
const { account, signAndSubmitTransaction } = useWallet();
```
2. Add an `onClick` event to the new list button:
```tsx
{/* illustrative markup; your template's button component may differ */}
<Button onClick={addNewList}>Add new list</Button>
```
3. Update the import statement from `@aptos-labs/wallet-adapter-react` to also import the `InputTransactionData` type:
```tsx
import {
useWallet,
InputTransactionData,
} from "@aptos-labs/wallet-adapter-react";
```
4. Add the `addNewList` function:
```tsx
const addNewList = async () => {
if (!account) return [];
const transaction:InputTransactionData = {
data: {
function:`${moduleAddress}::todolist::create_list`,
functionArguments:[]
}
}
try {
// sign and submit transaction to chain
const response = await signAndSubmitTransaction(transaction);
// wait for transaction
await aptosClient().waitForTransaction({transactionHash:response.hash});
setAccountHasList(true);
} catch (error: any) {
setAccountHasList(false);
}
};
```
5. Since our new function also uses `moduleAddress`, let’s get it out of the `fetchList` function scope to the global scope so it can be used globally.
In our `fetchList` function, find the line:
```tsx
const moduleAddress = MODULE_ADDRESS;
```
And move it to outside of the main `App` function, so it can be globally accessed.
**Let’s go over the `addNewList` function code.**
First, we use the `account` property from our wallet provider to make sure there is an account connected to our app.
Then we build our transaction data to be submitted to chain:
```tsx
const transaction:InputTransactionData = {
data: {
function:`${moduleAddress}::todolist::create_list`,
functionArguments:[]
}
}
```
* `function`- is built from the module address, module name and the function name.
* `functionArguments` - the arguments the function expects, in our case it doesn’t expect any arguments.
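For contrast, a payload for an entry function that does take arguments, for example the contract’s `create_task(content: String)`, would look like this (the module address below is a placeholder for illustration):

```typescript
// Illustrative payload: create_task takes one argument, the task content,
// which maps to the Move `content: String` parameter.
const moduleAddress = "0x123"; // placeholder; use your published module address
const createTaskTransaction = {
  data: {
    function: `${moduleAddress}::todolist::create_task`,
    functionArguments: ["New Task"],
  },
};
```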
Next, we submit the transaction payload and wait for its response. The response returned from the `signAndSubmitTransaction` function holds the transaction hash. Since it can take a bit for the transaction to be fully executed on chain and we also want to make sure it is executed successfully, we `waitForTransaction`. And only then we can set our local `accountHasList` state to `true`.
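The submit-then-wait flow can be sketched as a standalone helper with the wallet and client calls injected, which also makes the pattern easy to test; the names here are illustrative:

```typescript
// Illustrative sketch of the submit-then-wait pattern.
type SignAndSubmit = (tx: unknown) => Promise<{ hash: string }>;
type WaitForTransaction = (args: { transactionHash: string }) => Promise<void>;

async function submitAndWait(
  tx: unknown,
  signAndSubmit: SignAndSubmit,
  waitForTransaction: WaitForTransaction,
): Promise<string> {
  // returns immediately with the pending transaction hash
  const response = await signAndSubmit(tx);
  // resolves only once the transaction has executed on chain
  await waitForTransaction({ transactionHash: response.hash });
  return response.hash;
}
```

Only after the wait resolves is it safe to update local UI state such as `accountHasList`.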
6. Before testing our app, let’s tweak our UI a bit and add a Spinner component to show up while we are waiting for the transaction. Add a local state to keep track whether a transaction is in progress:
```tsx
const [transactionInProgress, setTransactionInProgress] =
useState(false);
```
7. Update our `addNewList` function to update the local state:
```tsx
const addNewList = async () => {
if (!account) return [];
setTransactionInProgress(true);
const transaction:InputTransactionData = {
data: {
function:`${moduleAddress}::todolist::create_list`,
functionArguments:[]
}
}
try {
// sign and submit transaction to chain
const response = await signAndSubmitTransaction(transaction);
// wait for transaction
await aptosClient().waitForTransaction({transactionHash:response.hash});
setAccountHasList(true);
} catch (error: any) {
setAccountHasList(false);
} finally {
setTransactionInProgress(false);
}
};
```
8. Update our UI with the following:
```tsx
return (
  <>
    ...
    {!accountHasList && (
      ...
    )}
  </>
);
```
Now you can head over to our app, and add a new list!
Since you haven’t made the user interface able to handle cases where an account has created a list, you will do so next [handling tasks](/build/guides/build-e2e-dapp/5-handle-tasks) in chapter 5.
# 5. Handle Tasks
In the fifth and final chapter of the tutorial on [building an end-to-end dapp on Aptos](/build/guides/build-e2e-dapp), you will add functionality to the app so the user interface is able to handle cases where an account has created a list.
We have covered how to [fetch data](/build/guides/build-e2e-dapp/3-fetch-data-from-chain) (an account’s todo list) from chain and how to [submit a transaction](/build/guides/build-e2e-dapp/4-submit-data-to-chain) (new todo list) to chain using Wallet.
Let’s finish building our app by implementing fetch tasks and adding a task function.
## Fetch tasks
[Section titled “Fetch tasks”](#fetch-tasks)
1. Create a local state `tasks` that will hold our tasks. It will be a state of a Task type (that has the same properties we set on our smart contract):
```typescript
type Task = {
  address: string;
  completed: boolean;
  content: string;
  task_id: string;
};

function App() {
  const [tasks, setTasks] = useState<Task[]>([]);
  ...
}
```
2. Update our `fetchList` function to fetch the tasks in the account’s `TodoList` resource:
```typescript
const fetchList = async () => {
if (!account) return [];
try {
const todoListResource = await aptosClient().getAccountResource({
accountAddress:account?.address,
resourceType:`${moduleAddress}::todolist::TodoList`
});
setAccountHasList(true);
// tasks table handle
const tableHandle = (todoListResource as any).tasks.handle;
// tasks table counter
const taskCounter = (todoListResource as any).task_counter;
let tasks = [];
let counter = 1;
while (counter <= taskCounter) {
const tableItem = {
key_type: "u64",
value_type: `${moduleAddress}::todolist::Task`,
key: `${counter}`,
};
const task = await aptosClient().getTableItem({handle:tableHandle, data:tableItem});
tasks.push(task);
counter++;
}
// set tasks in local state
setTasks(tasks);
} catch (e: any) {
setAccountHasList(false);
}
};
```
**This part is a bit confusing, so stick with us!**
Tasks are stored in a table (this is how we built our contract). To fetch a table item (i.e., a task), we need that task’s table handle. We also need the `task_counter` from the resource so we can loop from 1 up to the counter and fetch each task by its `task_id`.
```typescript
const tableHandle = (todoListResource as any).tasks.handle;
const taskCounter = (todoListResource as any).task_counter;
```
Now that we have our tasks table handle and our `task_counter` variable, let’s loop over the `taskCounter`. We define a `counter` and set it to 1, as the `task_counter` / `task_id` is never less than 1.
We loop while the `counter` is less than or equal to the `taskCounter`, fetch each table item, and push it to the tasks array:
```typescript
let tasks = [];
let counter = 1;
while (counter <= taskCounter) {
const tableItem = {
key_type: "u64",
value_type: `${moduleAddress}::todolist::Task`,
key: `${counter}`,
};
const task = await aptosClient().getTableItem({handle: tableHandle, data: tableItem});
tasks.push(task);
counter++;
}
```
We build a `tableItem` object to fetch. If we take a look at our table structure from the contract:
```move
tasks: Table<u64, Task>,
```
We see that it has a `key` type `u64` and a `value` of type `Task`. And whenever we create a new task, we assign the `key` to be the incremented task counter.
```move
// adds the new task into the tasks table
table::upsert(&mut todo_list.tasks, counter, new_task);
```
So the object we built is:
```typescript
{
  key_type: "u64",
  value_type: `${moduleAddress}::todolist::Task`,
  key: `${counter}`,
}
```
Where `key_type` is the table’s `key` type, `key` is the key value we are looking for, and `value_type` is the table’s `value` type, which is a `Task` struct. The `Task` struct type string uses the same format as our previous resource query:
* The account address who holds that module = our profile account address
* The module name the resource lives in = `todolist`
* The struct name = `Task`
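Putting those pieces together, the per-key requests for tasks `1..task_counter` can be sketched as a small helper (the helper name is illustrative):

```typescript
// Illustrative: build the table-item requests for task keys 1..taskCounter,
// matching the key/value types of the contract's tasks table.
function buildTaskTableItems(moduleAddress: string, taskCounter: number) {
  const items: { key_type: string; value_type: string; key: string }[] = [];
  for (let key = 1; key <= taskCounter; key++) {
    items.push({
      key_type: "u64",
      value_type: `${moduleAddress}::todolist::Task`,
      key: `${key}`,
    });
  }
  return items;
}
```

Each returned object is what the loop in `fetchList` passes to `getTableItem`, one per task.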
The last thing we want to do is display the tasks we just fetched.
3. In our `App.tsx` file, update our UI with the following code:
Import the `Input` using `import { Input } from "./components/ui/input";`
```tsx
{!accountHasList ? (
  ...
) : (
  ...
)}
```
That will display the **Add new list** button if the account doesn’t have a list, or the tasks if it does.
Go ahead and refresh your browser - see the magic!
We haven’t added any tasks yet, so we don’t see anything. Let’s add the option to add some tasks!
## Add task
[Section titled “Add task”](#add-task)
1. Update our UI with an *add task* input:
```tsx
{!accountHasList ? (
...
) : (