[ My apologies if this is off-topic here. I’m a newcomer to holochain. ]
I am struggling to understand how the Rust code for a Zome is converted into WASM structures, and how that WASM then communicates with the rest of the world. I've found bits and pieces of this here and there, but so far no comprehensive explanation. Below is my current take on this, with some questions.
1. Rust macros
There seem to be two layers of Rust macros: those in the HDK and those in HDKv2.
If I understand them correctly, HDKv2 first collects the contents of a `#[zome]` module into a struct `ZomeCodeDef`:
```rust
pub struct ZomeCodeDef {
    pub init: InitCallback,
    pub validate_agent: ValidateAgentCallback,
    pub zome_fns: ZomeFunctions,
    pub entry_def_fns: Vec<syn::ItemFn>,
    pub traits: ZomeTraits,
    pub receive_callback: Option<ReceiveCallback>,
    pub extra: Vec<syn::Item>,
}
```
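To make this concrete, here is a sketch of how I read the mapping from a `#[zome]` module to those fields. The attribute names and signatures are my best recollection from the hdk-proc-macros examples and may not be exact; imports are omitted:

```rust
// My reading of how a #[zome] module maps onto ZomeCodeDef.
// Attribute names and signatures are from memory and may be off.
#[zome]
mod my_zome {
    #[init]
    fn init() -> Result<(), String> {
        // collected into ZomeCodeDef::init
        Ok(())
    }

    #[validate_agent]
    fn validate_agent(_validation_data: EntryValidationData<AgentId>) -> Result<(), String> {
        // collected into ZomeCodeDef::validate_agent
        Ok(())
    }

    #[zome_fn("hc_public")]
    fn hello(name: String) -> ZomeApiResult<String> {
        // collected into ZomeCodeDef::zome_fns and exposed via the "hc_public" trait
        Ok(format!("Hello, {}!", name))
    }
}
```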
Then, at the HDK level, some of these fields apparently get converted into `pub extern "C"` functions, cf. `init`.
Am I right? Or are HDK and HDKv2 completely separate and independent? If that is the case, how does the external world then talk to the HDKv2 callbacks?
2. Gluing the external world to the WASM runtime
Anyway, at some point the Zome definitions need to provide the WASM runtime with a set of functions it can call. The following code in the HDK, for `init`, seems to define a public `extern "C" fn receive`, which apparently gets called by the underlying runtime when initialising the Zome. I'm including a simplified version of the unexpanded `receive` here:
pub extern "C" fn receive(
allocation_of_input: RibosomeEncodingBits) -> RibosomeEncodingBits
{
let init = init_global_memory(allocation_of_input);
fn execute() -> Result<(), String> {
$init_expr
}
match execute() {
Ok(_) => Success.into(),
Err(e) => …
}
}
Hence, the code seems simply to call the locally defined `execute` (which in this case is expanded from the `$init_expr` part of the macro), and to return either `Ok` or `Err` encoded into `RibosomeEncodingBits`. Elsewhere there seems to be more encoding and decoding going on.
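My mental model of `RibosomeEncodingBits` so far is that it is just a `u64` that either carries a status value or packs an offset/length pair pointing at a payload in WASM memory, something along these lines (purely my own illustration of the idea, not the actual holochain code):

```rust
// Purely my own illustration of the kind of packing I imagine,
// not the actual holochain-rust implementation.
type RibosomeEncodingBits = u64;

// pack an offset/length pair (pointing into WASM memory) into one u64
fn encode_allocation(offset: u32, length: u32) -> RibosomeEncodingBits {
    ((offset as u64) << 32) | (length as u64)
}

// recover the offset/length pair from the u64
fn decode_allocation(bits: RibosomeEncodingBits) -> (u32, u32) {
    ((bits >> 32) as u32, (bits & 0xFFFF_FFFF) as u32)
}
```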
I presume the same pattern is followed everywhere when the runtime calls the WASM code:

- Decode `RibosomeEncodingBits` into JSON (or something else)
- Call a locally defined `execute`
- Encode the result into `RibosomeEncodingBits`

Is that right?
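If so, I imagine every generated `extern "C"` export as a thin wrapper along these lines (all the names here are invented by me just to spell out the pattern):

```rust
// Hypothetical shape of a generated export; every name below is made up
// to spell out the decode → call → encode pattern, not taken from the HDK.
type RibosomeEncodingBits = u64;

#[no_mangle]
pub extern "C" fn my_zome_fn(encoded_input: RibosomeEncodingBits) -> RibosomeEncodingBits {
    // 1. decode the encoded allocation into a JSON payload
    let input_json = decode_input(encoded_input);
    // 2. call the locally defined function with the decoded arguments
    let result = execute(&input_json);
    // 3. encode the result back into a RibosomeEncodingBits value
    encode_output(result)
}

// stand-ins for whatever the macro really expands to
fn decode_input(bits: RibosomeEncodingBits) -> String {
    format!("{{\"input_bits\": {}}}", bits)
}

fn execute(input: &str) -> Result<String, String> {
    Ok(input.to_uppercase())
}

fn encode_output(result: Result<String, String>) -> RibosomeEncodingBits {
    // a real implementation would write the payload into WASM memory and
    // return its packed offset/length; here 0 just stands for "success"
    match result {
        Ok(_) => 0,
        Err(_) => 1,
    }
}
```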
Now, what I haven’t found yet is how the WASM code calls the runtime, nor the runtime side implementation of the encoding and decoding. Any pointers to that?