tf-a-next: changes to build TF-A from external repo
This is a first cut at the CI script changes needed to build RF-A the
"new" way, aligned with the changes proposed in RF-A change #36846.
There are a number of hacks (detailed below) which should be cleaned up
in the future, but this first set of changes allows us to transition to
the new RF-A repository sooner, before the final, proper solution is
implemented.
Ultimately, the proper solution should introduce RF-A as a first-class
component to build, alongside TF-A, TFTF, SPM and others. This involves
redefining the test configuration naming convention into something like
<tf-config>-<rfa-config>-<tftf-config>/<run_config>, cloning the RF-A
repository the proper way, and so on.
For now, this patch vampirizes most of TF-A's build machinery, as much
of it can be reused for RF-A as well. Of course, some things need to
differ, so the scripts need a way to know that they are actually
building RF-A rather than TF-A. To that end, RF-A build configuration
files (under `tf_config/`) must contain `RUST=1`. Even though the RF-A
build system no longer makes use of this build flag, the scripts now
rely on it for conditional logic like this:
if upon "$(get_tf_opt RUST)"; then
    # Do something specific to RF-A.
    # This code won't be executed for a TF-A build.
fi
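For illustration, a minimal RF-A build configuration under `tf_config/`
might look like the sketch below; the file name and the PLAT line are
assumptions made up for this example, while `RUST=1` is the only line
the scripts actually key on:
---8<---
# tf_config/fvp-next (illustrative name and contents)
PLAT=fvp  # hypothetical: regular TF-A build options may still be set here
RUST=1    # marker telling the CI scripts that this is an RF-A build
---8<---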
Additionally, this patch removes the FIP building logic from the FVP
test configuration files (under group/) since RF-A's build system now
takes care of it.
Finally, this patch adds `TFA_FLAGS="FVP_TRUSTED_SRAM_SIZE=512"` to the
FVP platform build configuration file. This makes RF-A's build system
compile TF-A with `FVP_TRUSTED_SRAM_SIZE=512`, which is required for
debug builds of RF-A to fit in memory.
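For reference, the addition boils down to a single line in that file,
shown here as an excerpt (the exact file path is not spelled out here):
---8<---
# FVP platform build configuration file (excerpt)
# RF-A's build system passes these flags on to the TF-A build.
TFA_FLAGS="FVP_TRUSTED_SRAM_SIZE=512"
---8<---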
With all these changes, one can use the following command line to build
RF-A for FVP and QEMU through the TF-A CI scripts:
TFA=</path/to/tfa/> \
tf_root=</path/to/rfa/> \
test_groups=tf-next-build \
dont_clean=0 \
run_test.sh
(`dont_clean=0` is required because neither the TF-A nor the RF-A build
system tracks changes to build options such as `PLAT`.)
And to build and run on FVP:
TFA=</path/to/tfa/> \
tf_root=</path/to/rfa/> \
test_groups=rfa-fvp/fvp-next:fvp-next \
run_test.sh
run_test.sh is a wrapper script around script/run_local_ci.sh; here is
an example:
---8<---
export workspace="${workspace:-/tmp/ws}"
rm -rf "$workspace"
# Your Base FVP tree needs to be accessible through $nfs_volume/warehouse/
export nfs_volume="FILL ME"
export TFA="${TFA-/path/to/tfa}" # Leave empty if you want a fresh clone.
export tf_root="${tf_root-/path/to/rfa}" # Hack: This is RF-A, not TF-A!
export ci_root="${ci_root-/path/to/tf-a-ci-scripts}"
# Do not clone the other repositories below.
export tftf_root=none
export spm_root=none
export rmm_root=none
export scp_root=none
export tfm_tests_root=none
export tfm_extras_root=none
"$ci_root/script/run_local_ci.sh"
---8<---
The fact that tf_root must actually point to the RF-A repository (not
the TF-A one) is a consequence of some of the hacks used to reuse the
TF-A build machinery.
The TFA variable may be omitted, in which case the scripts will clone
the TF-A repository for you. It will then appear under the workspace
(/tmp/ws/tfa by default).
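For example, to let the scripts perform a fresh TF-A clone, run the
same build command as above with TFA left unset:
---8<---
tf_root=</path/to/rfa/> \
test_groups=tf-next-build \
dont_clean=0 \
run_test.sh
---8<---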
Change-Id: Id19885b949643b680ef2026aaa192fd90203f943
Signed-off-by: Sandrine Afsa <sandrine.afsa@arm.com>
diff --git a/script/build_package.sh b/script/build_package.sh
index a6e99ec..758d083 100755
--- a/script/build_package.sh
+++ b/script/build_package.sh
@@ -228,6 +228,18 @@
fi
}
+collect_rfa_artefacts() {
+ if [ ! -d "${from:?}" ]; then
+ return
+ fi
+
+ if ! find "$from" -maxdepth 1 \( -name "*.bin" -o -name '*.elf' \) -exec cp -t "${to:?}" '{}' +; then
+ echo "You probably are running local CI on local repositories."
+ echo "Did you set 'dont_clean' but forgot to run 'distclean'?"
+ die
+ fi
+}
+
# Map the UART ID used for expect with the UART descriptor and port
# used by the FPGA automation tools.
map_uart() {
@@ -610,6 +622,47 @@
)
}
+build_rfa() {
+ (
+ config_file="${tf_build_config:-$tf_config_file}"
+
+ # Build the 'all' target by default.
+ build_targets="${tf_build_targets:-all}"
+
+ source "$config_file"
+
+ cd "$tf_root/rust"
+
+ # Always distclean when running on Jenkins. Skip distclean when running
+ # locally and explicitly requested.
+ if upon "$jenkins_run" || not_upon "$dont_clean"; then
+ echo "Cleaning TF-A..."
+ make -C "$TFA" distclean &>>"$build_log" || fail_build
+
+ echo 'Cleaning RF-A...'
+ cargo clean &>>"$build_log" || fail_build
+ fi
+
+ # Log build command line. It is left unfolded on purpose to assist
+ # copying to clipboard.
+ cat <<EOF | log_separator >/dev/null
+
+Build command line:
+ make $make_j_opts $(cat "$config_file" | tr '\n' ' ') DEBUG=$DEBUG $build_targets
+
+cargo version:
+$(cargo --version 2>&1)
+EOF
+
+ # Build RF-A and TF-A. Since build output is being directed to the build
+ # log, have descriptor 3 point to the current terminal for build
+ # wrappers to vent.
+ make $make_j_opts $(cat "$config_file") \
+ DEBUG="$DEBUG" \
+ $build_targets 3>&1 &>>"$build_log" || fail_build
+ )
+}
+
build_tftf() {
(
config_file="${tftf_build_config:-$tftf_config_file}"
@@ -1679,38 +1732,60 @@
fvp_tsram_size="$(get_tf_opt FVP_TRUSTED_SRAM_SIZE)"
fvp_tsram_size="${fvp_tsram_size:-256}"
- poetry -C "$tf_root" install --without docs
+ if not_upon "$(get_tf_opt RUST)"; then
+ poetry -C "$tf_root" install --without docs
+ fi
archive="$build_archive"
- tf_build_root="$tf_root/build"
- echo "Building Trusted Firmware ($mode) ..." |& log_separator
+ if not_upon "$(get_tf_opt RUST)"; then
+ tf_build_root="$tf_root/build"
+ echo "Building Trusted Firmware ($mode) ..." |& log_separator
+ else
+ # Clone TF-A repo if required. Save its path into the
+ # special variable "TFA", which is used by RF-A build
+ # system.
+ export TFA="${TFA-$workspace/tfa}"
+ if assert_can_git_clone "TFA"; then
+ echo "Cloning TF-A..."
+ clone_url="$tf_src_repo_url" where="$TFA" clone_repo
+ fi
+ show_head "$TFA"
+ poetry -C "$TFA" install --without docs
- if upon "$(get_tf_opt RUST)" && not_upon "$local_ci"; then
- # In the CI Dockerfile, rustup is installed by the root user in the
- # non-default location /usr/local/rustup, so $RUSTUP_HOME is required to
- # access rust config e.g. default toolchains and run cargo
- #
- # Leave $CARGO_HOME blank so when this script is run in CI by the buildslave
- # user, it uses the default /home/buildslave/.cargo directory which it has
- # write permissions for - that allows it to download new crates during
- # compilation
- #
- # The buildslave user does not have write permissions to the default
- # $CARGO_HOME=/usr/local/cargo dir and so will error when trying to download
- # new crates otherwise
- #
- # note: $PATH still contains /usr/local/cargo/bin at this point so cargo is
- # still run via the root installation
- #
- # see https://github.com/rust-lang/rustup/issues/1085
- set_hook_var "RUSTUP_HOME" "/usr/local/rustup"
+ rfa_build_root="$tf_root/rust/target"
+ echo "Building Rusted Firmware ($mode) ..." |& log_separator
+
+ if not_upon "$local_ci"; then
+ # In the CI Dockerfile, rustup is installed by the root user in the
+ # non-default location /usr/local/rustup, so $RUSTUP_HOME is required to
+ # access rust config e.g. default toolchains and run cargo
+ #
+ # Leave $CARGO_HOME blank so when this script is run in CI by the buildslave
+ # user, it uses the default /home/buildslave/.cargo directory which it has
+ # write permissions for - that allows it to download new crates during
+ # compilation
+ #
+ # The buildslave user does not have write permissions to the default
+ # $CARGO_HOME=/usr/local/cargo dir and so will error when trying to download
+ # new crates otherwise
+ #
+ # note: $PATH still contains /usr/local/cargo/bin at this point so cargo is
+ # still run via the root installation
+ #
+ # see https://github.com/rust-lang/rustup/issues/1085
+ set_hook_var "RUSTUP_HOME" "/usr/local/rustup"
+ fi
fi
# Call pre-build hook
call_hook pre_tf_build
- build_tf
+ if upon "$(get_tf_opt RUST)"; then
+ build_rfa
+ else
+ build_tf
+ fi
# Call post-build hook
call_hook post_tf_build
@@ -1718,13 +1793,12 @@
# Pre-archive hook
call_hook pre_tf_archive
- if upon "$(get_tf_opt RUST)"; then
- # for archiving into the Jenkins artifacts directory
- ln -fsr $tf_root/rust/target/bl31.{bin,elf} $tf_build_root
+ if not_upon "$(get_tf_opt RUST)"; then
+ from="$tf_build_root" to="$archive" collect_build_artefacts
+ else
+ from="$rfa_build_root" to="$archive" collect_rfa_artefacts
fi
- from="$tf_build_root" to="$archive" collect_build_artefacts
-
# Post-archive hook
call_hook post_tf_archive