IMC23 Paper Artifacts

The paper is associated with the following types of artifacts:

Figshare material

The artifacts are stored in a figshare collection with the following items:

Downloading artifacts

Each artifact can be downloaded manually from the figshare collection. When doing so, make sure to refer to the latest version of each archive.

tcbench offers automated procedures to fetch the right content from figshare:

  • For datasets, please refer to the datasets page, the specific page for each dataset, and the import command.

  • For the remaining artifacts, you can use the fetch-artifacts subcommand via the following process:

  • First of all, prepare a Python virtual environment, for example via conda

    conda create -n tcbench python=3.10 pip
    conda activate tcbench
    

  • Clone the tcbench repo and check out the imc23 branch

    git clone https://github.com/tcbenchstack/tcbench.git tcbench.git
    cd tcbench.git
    git checkout imc23
    

  • Install tcbench

    python -m pip install .[dev]
    

  • Fetch the artifacts

    tcbench fetch-artifacts
    

This installs the following locally:

  • The notebooks for replicating the tables and figures of the paper, under /notebooks/imc23. The cloned repository already contains these notebooks, but since the code in the repository might change over time, the version fetched from figshare is guaranteed to be identical to the one used for the submission.

  • The ml-artifacts under /notebooks/imc23/campaigns.

  • The pytest resources for enabling unit tests.

Package dependency versions and the imc23 branch

When installing tcbench from pypi or from the main branch of the repository, only a few critical packages are pinned to a specific version.

If you are trying to replicate the results of the paper, please use the imc23 branch, which also contains a requirements-imc23.txt file generated via pip freeze from the environment used to collect the results.

In our experience, the most likely cause of inconsistent results is a mismatch in package versions.
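Such mismatches can be spotted by comparing the pinned versions in requirements-imc23.txt against what is actually installed. The sketch below is a hedged illustration (the helper name is hypothetical, not a tcbench API), using only the standard library:

```python
from importlib import metadata

def version_mismatches(requirements_text):
    """Compare pinned `name==version` lines against the installed packages.

    Returns {name: (pinned, installed_or_None)} for every package whose
    installed version differs from the pin (None means not installed).
    """
    mismatches = {}
    for line in requirements_text.splitlines():
        line = line.strip()
        # Skip blanks, comments, and lines without an exact pin.
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, pinned = line.split("==", 1)
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None
        if installed != pinned:
            mismatches[name] = (pinned, installed)
    return mismatches
```

For example, `version_mismatches(Path("requirements-imc23.txt").read_text())` run inside the replication environment should return an empty dict; any entries it does return point at packages to reinstall at the pinned version.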