Merge branch 'master' into backport-tests-for-claim-assets
Zihan Zhao authored and committed Jul 10, 2024
2 parents 988a1a6 + 9403a5d commit 35804f8
Showing 8 changed files with 107 additions and 2 deletions.
@@ -1,5 +1,9 @@
# Candidate Backing

> NOTE: This module has undergone changes for the elastic scaling implementation. As a result, parts of this document
> may be out of date and will be updated at a later time. Issue tracking the update:
> https://github.com/paritytech/polkadot-sdk/issues/3699

The Candidate Backing subsystem ensures every parablock considered for relay block inclusion has been seconded by at
least one validator, and approved by a quorum. Parablocks for which not enough validators will assert correctness are
discarded. If the block later proves invalid, the initial backers are slashable; this gives Polkadot a rational threat
@@ -1,5 +1,9 @@
# Prospective Parachains

> NOTE: This module has undergone changes for the elastic scaling implementation. As a result, parts of this document
> may be out of date and will be updated at a later time. Issue tracking the update:
> https://github.com/paritytech/polkadot-sdk/issues/3699

## Overview

**Purpose:** Tracks and handles prospective parachain fragments and informs
@@ -1,5 +1,9 @@
# Collator Protocol

> NOTE: This module has undergone changes for the elastic scaling implementation. As a result, parts of this document
> may be out of date and will be updated at a later time. Issue tracking the update:
> https://github.com/paritytech/polkadot-sdk/issues/3699

The Collator Protocol implements the network protocol by which collators and validators communicate. It is used by
collators to distribute collations to validators and used by validators to accept collations by collators.

@@ -1,5 +1,9 @@
# Provisioner

> NOTE: This module has undergone changes for the elastic scaling implementation. As a result, parts of this document
> may be out of date and will be updated at a later time. Issue tracking the update:
> https://github.com/paritytech/polkadot-sdk/issues/3699

Relay chain block authorship authority is governed by BABE and is beyond the scope of the Overseer and the rest of the
subsystems. That said, ultimately the block author needs to select a set of backable parachain candidates and other
consensus data, and assemble a block from them. This subsystem is responsible for providing the necessary data to all
4 changes: 4 additions & 0 deletions polkadot/roadmap/implementers-guide/src/runtime/inclusion.md
@@ -1,5 +1,9 @@
# Inclusion Pallet

> NOTE: This module has undergone changes for the elastic scaling implementation. As a result, parts of this document
> may be out of date and will be updated at a later time. Issue tracking the update:
> https://github.com/paritytech/polkadot-sdk/issues/3699

The inclusion module is responsible for inclusion and availability of scheduled parachains. It also manages the UMP
dispatch queue of each parachain.

@@ -1,5 +1,9 @@
# `ParaInherent`

> NOTE: This module has undergone changes for the elastic scaling implementation. As a result, parts of this document
> may be out of date and will be updated at a later time. Issue tracking the update:
> https://github.com/paritytech/polkadot-sdk/issues/3699

This module is responsible for providing all data given to the runtime by the block author to the various parachains
modules. The entry-point is mandatory, in that it must be invoked exactly once within every block, and it is also
"inherent", in that it is provided with no origin by the block author. The data within it carries its own
67 changes: 65 additions & 2 deletions polkadot/xcm/src/v2/mod.rs
@@ -62,7 +62,10 @@ use super::{
};
use alloc::{vec, vec::Vec};
use bounded_collections::{ConstU32, WeakBoundedVec};
-use codec::{self, Decode, Encode, MaxEncodedLen};
+use codec::{
+    self, decode_vec_with_len, Compact, Decode, Encode, Error as CodecError, Input as CodecInput,
+    MaxEncodedLen,
+};
use core::{fmt::Debug, result};
use derivative::Derivative;
use scale_info::TypeInfo;
@@ -278,14 +281,39 @@ pub const VERSION: super::Version = 2;
pub type QueryId = u64;

/// DEPRECATED. Please use XCMv3 or XCMv4 instead.
-#[derive(Derivative, Default, Encode, Decode, TypeInfo)]
+#[derive(Derivative, Default, Encode, TypeInfo)]
#[derivative(Clone(bound = ""), Eq(bound = ""), PartialEq(bound = ""), Debug(bound = ""))]
#[codec(encode_bound())]
#[codec(decode_bound())]
#[scale_info(bounds(), skip_type_params(RuntimeCall))]
#[scale_info(replace_segment("staging_xcm", "xcm"))]
pub struct Xcm<RuntimeCall>(pub Vec<Instruction<RuntimeCall>>);

environmental::environmental!(instructions_count: u8);

impl<Call> Decode for Xcm<Call> {
    fn decode<I: CodecInput>(input: &mut I) -> core::result::Result<Self, CodecError> {
        instructions_count::using_once(&mut 0, || {
            let number_of_instructions: u32 = <Compact<u32>>::decode(input)?.into();
            instructions_count::with(|count| {
                *count = count.saturating_add(number_of_instructions as u8);
                if *count > MAX_INSTRUCTIONS_TO_DECODE {
                    return Err(CodecError::from("Max instructions exceeded"))
                }
                Ok(())
            })
            .unwrap_or(Ok(()))?;
            let decoded_instructions = decode_vec_with_len(input, number_of_instructions as usize)?;
            Ok(Self(decoded_instructions))
        })
    }
}

/// The maximal number of instructions in an XCM before decoding fails.
///
/// This is a deliberate limit - not a technical one.
pub const MAX_INSTRUCTIONS_TO_DECODE: u8 = 100;

impl<RuntimeCall> Xcm<RuntimeCall> {
    /// Create an empty instance.
    pub fn new() -> Self {
@@ -1157,3 +1185,38 @@ impl<RuntimeCall> TryFrom<NewInstruction<RuntimeCall>> for Instruction<RuntimeCall>
        })
    }
}
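The new `Decode` implementation above begins by reading a SCALE `Compact<u32>` length prefix before decoding the instruction vector. As a rough illustration of what that prefix looks like on the wire, here is a hand-rolled sketch of compact-u32 encoding; this is illustrative only, and real code should use `Compact<u32>` from the `parity-scale-codec` crate rather than these hypothetical helpers:

```rust
// Sketch of SCALE compact-u32 encoding: the low two bits of the first byte
// select the mode, and the value is stored shifted left by two.
fn compact_encode(v: u32) -> Vec<u8> {
    match v {
        // Single byte, mode bits 00: values 0..=63.
        0..=0x3F => vec![(v as u8) << 2],
        // Two bytes little-endian, mode bits 01: values up to 2^14 - 1.
        0x40..=0x3FFF => ((v << 2) | 0b01).to_le_bytes()[..2].to_vec(),
        // Four bytes little-endian, mode bits 10: values up to 2^30 - 1.
        0x4000..=0x3FFF_FFFF => ((v << 2) | 0b10).to_le_bytes().to_vec(),
        // Marker byte with mode bits 11, followed by the full little-endian u32.
        _ => {
            let mut out = vec![0b0000_0011];
            out.extend_from_slice(&v.to_le_bytes());
            out
        },
    }
}

fn compact_decode(bytes: &[u8]) -> u32 {
    match bytes[0] & 0b11 {
        0b00 => (bytes[0] >> 2) as u32,
        0b01 => (u16::from_le_bytes([bytes[0], bytes[1]]) >> 2) as u32,
        0b10 => u32::from_le_bytes([bytes[0], bytes[1], bytes[2], bytes[3]]) >> 2,
        _ => u32::from_le_bytes([bytes[1], bytes[2], bytes[3], bytes[4]]),
    }
}

fn main() {
    // A 100-instruction XCM carries the prefix compact(100): two bytes.
    let prefix = compact_encode(100);
    assert_eq!(prefix, vec![0x91, 0x01]);
    assert_eq!(compact_decode(&prefix), 100);
    // Lengths up to 63 fit in a single byte.
    assert_eq!(compact_encode(63), vec![0xFC]);
}
```

This is why the decoder can check the declared instruction count against `MAX_INSTRUCTIONS_TO_DECODE` before decoding any instruction bodies.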

#[cfg(test)]
mod tests {
    use super::{prelude::*, *};

    #[test]
    fn decoding_respects_limit() {
        let max_xcm = Xcm::<()>(vec![ClearOrigin; MAX_INSTRUCTIONS_TO_DECODE as usize]);
        let encoded = max_xcm.encode();
        assert!(Xcm::<()>::decode(&mut &encoded[..]).is_ok());

        let big_xcm = Xcm::<()>(vec![ClearOrigin; MAX_INSTRUCTIONS_TO_DECODE as usize + 1]);
        let encoded = big_xcm.encode();
        assert!(Xcm::<()>::decode(&mut &encoded[..]).is_err());

        let nested_xcm = Xcm::<()>(vec![
            DepositReserveAsset {
                assets: All.into(),
                dest: Here.into(),
                xcm: max_xcm,
                max_assets: 1,
            };
            (MAX_INSTRUCTIONS_TO_DECODE / 2) as usize
        ]);
        let encoded = nested_xcm.encode();
        assert!(Xcm::<()>::decode(&mut &encoded[..]).is_err());

        let even_more_nested_xcm = Xcm::<()>(vec![SetAppendix(nested_xcm); 64]);
        let encoded = even_more_nested_xcm.encode();
        assert_eq!(encoded.len(), 345730);
        // This should not decode since the limit is 100
        assert_eq!(MAX_INSTRUCTIONS_TO_DECODE, 100, "precondition");
        assert!(Xcm::<()>::decode(&mut &encoded[..]).is_err());
    }
}
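The nested-decoding behavior these tests exercise relies on `environmental!` giving every recursive `Xcm::decode` call on the same call stack one shared counter. A simplified, self-contained sketch of that mechanism follows; the `thread_local!` `Cell` is a stand-in for the `environmental` crate, and `Instr` is a toy instruction type, both assumptions for illustration only:

```rust
use std::cell::Cell;

const MAX_INSTRUCTIONS_TO_DECODE: u32 = 100;

// Stand-in for `environmental::environmental!(instructions_count)`:
// one counter shared by all nested `decode` calls on this thread.
thread_local! {
    static COUNT: Cell<Option<u32>> = Cell::new(None);
}

enum Instr {
    Leaf,               // an instruction with no nested message
    Nested(Vec<Instr>), // an instruction carrying a nested XCM
}

// Charge every instruction, at any nesting depth, against one shared
// budget, mirroring how `Xcm::decode` recurses through nested messages.
fn decode(instrs: &[Instr]) -> Result<(), &'static str> {
    // The outermost call initializes the counter (like `using_once`).
    let outermost = COUNT.with(|c| {
        if c.get().is_none() {
            c.set(Some(0));
            true
        } else {
            false
        }
    });
    let result = (|| {
        COUNT.with(|c| {
            let total = c.get().unwrap_or(0).saturating_add(instrs.len() as u32);
            c.set(Some(total));
            if total > MAX_INSTRUCTIONS_TO_DECODE {
                return Err("Max instructions exceeded");
            }
            Ok(())
        })?;
        for instr in instrs {
            if let Instr::Nested(inner) = instr {
                decode(inner)?; // nested messages share the same counter
            }
        }
        Ok(())
    })();
    if outermost {
        COUNT.with(|c| c.set(None)); // tear down when the outer decode ends
    }
    result
}

fn main() {
    // Exactly at the limit: accepted.
    let flat: Vec<Instr> = (0..100).map(|_| Instr::Leaf).collect();
    assert!(decode(&flat).is_ok());

    // Two instructions, one nesting 100 more: rejected, because nesting
    // does not reset the budget.
    let nested = vec![
        Instr::Nested((0..100).map(|_| Instr::Leaf).collect()),
        Instr::Leaf,
    ];
    assert!(decode(&nested).is_err());
}
```

This shared budget is what makes the `nested_xcm` and `even_more_nested_xcm` cases above fail to decode even though each individual layer stays under 100 instructions.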
18 changes: 18 additions & 0 deletions prdoc/pr_4978.prdoc
@@ -0,0 +1,18 @@
# Schema: Polkadot SDK PRDoc Schema (prdoc) v1.0.0
# See doc at https://raw.githubusercontent.com/paritytech/polkadot-sdk/master/prdoc/schema_user.json

title: Add MAX_INSTRUCTIONS_TO_DECODE to XCMv2

doc:
- audience: Runtime User
description: |
Added a max number of instructions to XCMv2. If using XCMv2, you'll have to take this limit into account.
It was set to 100.
- audience: Runtime Dev
description: |
Added a max number of instructions to XCMv2. If using XCMv2, you'll have to take this limit into account.
It was set to 100.

crates:
- name: staging-xcm
bump: minor
