opengradient-neuroml
v1.0.2
A Solidity library for using inference on OpenGradient
OpenGradient NeuroML Library
The NeuroML library is a set of Solidity interfaces and precompiles that allow smart contract developers to use OpenGradient's native inference capabilities directly from smart contracts. Developers can run inference on AI and ML models through a simple function call executed atomically within the same transaction.
Benefits
The main benefits of running inference through NeuroML include:
Atomic execution: Inferences run atomically as part of the EVM transaction that triggers them, making it easier to ensure state consistency.
Simple interface: You can run inferences through a single function call, with no need for callback functions or handlers.
Composability: Through smart contract transactions, you can chain multiple models together using arbitrarily complex logic, supporting advanced real-world use cases.
Native verification: The OpenGradient network protocol natively validates inference validity proofs (e.g., ZKML and TEE). This means that smart contracts can trust the results without explicit verification.
Installation
You can install the NeuroML library by running:
npm i opengradient-neuroml
NeuroML Example
pragma solidity ^0.8.10;

import "opengradient-neuroml/src/OGInference.sol";

contract Test {
    function run() external returns (ModelOutput memory) {
        // Model input: one number tensor, no string tensors
        ModelInput memory modelInput = ModelInput(
            new TensorLib.MultiDimensionalNumberTensor[](1),
            new TensorLib.StringTensor[](0));

        TensorLib.Number[] memory numbers = new TensorLib.Number[](2);
        numbers[0] = TensorLib.Number(7286679744720459, 17); // 0.07286679744720459
        numbers[1] = TensorLib.Number(4486280083656311, 16); // 0.4486280083656311
        modelInput.numbers[0] = TensorLib.numberTensor1D("input", numbers);

        // Run the model atomically in ZK mode, addressed by its CID
        ModelOutput memory output = OG_INFERENCE_CONTRACT.runModelInference(
            ModelInferenceRequest(
                ModelInferenceMode.ZK,
                "QmbbzDwqSxZSgkz1EbsNHp2mb67rYeUYHYWJ4wECE24S7A",
                modelInput));

        return output;
    }
}
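The TensorLib.Number values in the example encode decimals as an integer value plus a decimal-place count, as shown in the inline comments. Assuming the value × 10^(-decimals) interpretation, the encoding can be sketched and checked off-chain in Python (the helper names here are illustrative, not part of the library):

```python
from decimal import Decimal

def og_number_to_decimal(value: int, decimals: int) -> Decimal:
    """Decode a (value, decimals) fixed-point pair into a Decimal."""
    return Decimal(value) / (Decimal(10) ** decimals)

def decimal_to_og_number(x: str) -> tuple:
    """Encode a decimal string as (value, decimals), mirroring the example above."""
    d = Decimal(x)
    decimals = -d.as_tuple().exponent  # number of digits after the decimal point
    value = int(d.scaleb(decimals))    # shift the point right to get an integer
    return value, decimals

print(og_number_to_decimal(7286679744720459, 17))   # 0.07286679744720459
print(decimal_to_og_number("0.4486280083656311"))   # (4486280083656311, 16)
```

This round-trips both tensor entries from the example, which can help sanity-check inputs before submitting a transaction.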
Read the NeuroML library documentation and how-to guides here.