Joint Artifact Evaluation Track and ROSE Festival - Call for Submissions
(Virtual Event)
The ICSME 2021 Joint Artifact Evaluation Track and ROSE Festival celebrates open science in Software Engineering research.
The Artifact Evaluation Track will assess artifacts submitted by the authors of papers accepted for publication in the ICSME, SCAM, and VISSOFT technical research tracks, and will award badges, displayed on those papers, that recognize their contributions to open science.
The ROSE Festival (Recognizing and Rewarding Open Science in SE), a special track within ICSME, is a venue where researchers can receive public credit for facilitating and participating in open science in Software Engineering. For ICSME 2021, the ROSE Festival will (a) host lightning talks about the accepted artifacts, and (b) present peer-reviewed short position papers about open science in Software Engineering research.
Artifact Evaluation Track (ICSME, SCAM, VISSOFT). Authors of papers accepted to ICSME, SCAM, and VISSOFT 2021 are invited to submit artifacts associated with those papers to the ICSME Artifact Evaluation Track for evaluation as candidate available, reusable, reproduced, or replicated artifacts. Papers with artifacts that meet our review criteria will be awarded badges, noting their contributions to open science in Software Engineering. If an artifact is accepted, authors will be invited to give a lightning talk on the artifact at the ROSE Festival Track held during ICSME 2021, and we will work with IEEE to add badges corresponding to the Available, Reusable, Reproduced, and Replicated categories to the electronic versions of the authors' paper(s). For more information, see the call for submissions below.
Position Papers (ROSE Festival). We invite authors to submit short position papers of up to four pages (plus two pages for references) about open science in Software Engineering research, including replication, reproducibility, and artifact evaluation. Authors will be invited to give a lightning talk on the position paper at ICSME 2021. For more information, see the CfP below.
Call for Position Papers
Important Dates
**All submission dates are at 23:59 AoE (Anywhere on Earth, UTC-12)**
- Paper Submission: Tuesday, June 29, 2021
- Author Notification: Saturday, July 17, 2021
- Camera-Ready: Sunday, August 15, 2021
Goal and Scope
Researchers and practitioners are invited to submit position papers (maximum of four pages + two pages for references) about trends and issues of importance regarding open science in software engineering research, particularly with regard to the set of topics within the scope of the ICSME community. We will consider a broad range of position papers, including analyses and retrospectives on how the community has embraced or rejected open science principles. Particular topics of interest include:
- Best practices for reproduction and replication studies
- Best practices for artifact preparation and evaluation
- Impact of open science practices on industrial adoption of research
- How to promote open science principles in the research community
- Experiences with replication and reproduction (including negative experiences)
These papers should fall within the scope of topics covered by ICSME, VISSOFT, and SCAM, and be related (in a broad sense) to one or more of the following topics:
- Change and defect management
- Code cloning and provenance
- Concept and feature location
- Continuous integration/deployment
- Empirical studies of software maintenance and evolution
- Evolution of non-code artifacts
- Human aspects of software maintenance and evolution
- Maintenance and evolution of model-based methods
- Maintenance and evolution processes
- Maintenance and evolution of mobile apps
- Maintenance versus release process
- Mining software repositories
- Productivity of software engineers
- Release engineering
- Reverse engineering and re-engineering
- Run-time evolution and dynamic configuration
- Service oriented and cloud computing
- Software and system comprehension
- Software migration and renovation
- Software quality assessment
- Software refactoring and restructuring
- Software testing theory and practice
- Source code analysis and manipulation
While many open science principles and challenges are universal within the broad software engineering community, the papers published in this track should be targeted towards the ICSME/VISSOFT/SCAM community. At minimum, concrete examples should target one or more of the topics listed above.
Evaluation
Each submitted paper will be reviewed by at least three members of the ROSE Festival program committee. Contributions must, in a highly convincing manner, clearly articulate their vision, novelty, and potential impact. Submissions will be evaluated on the basis of soundness, importance of contribution, originality, quality of presentation, and potential to inspire discussion. Submissions that are not in compliance with the required submission format, or that are out of the scope of the conference, will be desk-rejected without being reviewed. Submitted papers must comply with the IEEE plagiarism policy and procedures. Submitted papers must not have been published elsewhere and must not be under review or submitted for review elsewhere while under consideration. Submitting the same paper to different tracks of ICSME/VISSOFT/SCAM 2021 is also not allowed.
Publication and Presentation
Accepted papers will be published in the conference proceedings and submitted for inclusion in the IEEE Xplore Digital Library. All authors of all accepted papers will be asked to complete an electronic IEEE Copyright form and will receive further instructions for preparing their camera-ready versions. At least one author of each accepted paper must register for the conference and present the paper at the conference. Failure of at least one author to register by the early registration date will result in the paper being withdrawn from the conference proceedings. IEEE reserves the right to exclude a paper from distribution after the conference (e.g., by not placing it in the IEEE Xplore Digital Library) if the paper is not presented at the conference. Presentation details will follow notifications of acceptance.
How to submit
We follow a double-blind reviewing process. Submitted papers must adhere to the following rules:
- Author names and affiliations must be omitted. (The track co-chairs will check compliance before reviewing begins.)
- References to authors' own related work must be in the third person. (For example, not "We build on our previous work..." but rather "We build on the work of [name]...")
Please see the Double-Blind Reviewing FAQ for more information and guidance.
Papers must strictly adhere to the two-column IEEE conference proceedings format. Please use the templates available here. LaTeX users should use the following configuration: \documentclass[conference]{IEEEtran}. Microsoft Word users should use the US Letter format template. Papers must not exceed 4 pages (including figures and appendices) plus up to 2 pages that contain ONLY references. All submissions must be in PDF and must be submitted online by the deadline via the ICSME 2021 EasyChair conference management system (“ROSE Festival Position Papers” Track). Any relevant supplemental material should also be anonymized and submitted by the same deadline through EasyChair. All authors, reviewers, and organizers are expected to uphold the IEEE Code of Conduct.
ICSME supports and encourages Green Open Access (also called self-archiving). We encourage you to self-archive a preprint of your accepted manuscript in an e-print server such as arXiv.org. Open access increases the availability of your work and its citation impact. To learn more about open access, please read the Green Open Access FAQ by Arie van Deursen. Note that if your research includes scraped GitHub data, the GitHub Terms of Service require that “publications resulting from that research are open access”. If possible, we recommend that you archive your paper (e.g., on arXiv or on your website) only after the ICSME reviewing process is completed, to avoid undermining the double-blind reviewing process in place.
Call for Artifact Submissions
We invite authors of papers of all lengths accepted to the ICSME, SCAM, and VISSOFT 2021 Technical Tracks to submit artifacts associated with those papers to the ICSME Artifact Evaluation Track. Papers with artifacts that meet our review criteria will be awarded badges, noting their contributions to open science in SE. If an artifact is accepted, authors will be invited to give a lightning talk on the artifact during the ROSE Festival at ICSME 2021, and we will work with IEEE to add badges corresponding to the Available, Reusable, Reproduced, and Replicated categories defined in the table below to the electronic versions of the paper(s). Artifacts of interest include (but are not limited to) the following:
- Software, i.e., implementations of systems or algorithms potentially useful in other studies.
- Automated experiments that replicate the study in the accepted paper.
- Data repositories, which are data (e.g., logging data, system traces, survey raw data) that can be used for multiple software engineering approaches.
- Frameworks, which are tools and services illustrating new approaches to software engineering that could be used by other researchers in different contexts.
- Qualitative artifacts such as interview scripts and survey templates.
This list is not exhaustive, so authors are asked to email the chairs before submitting if their proposed artifact is not on this list. For additional types of artifacts, please see here.
Important Dates
Deadlines for all ICSME, VISSOFT, and SCAM tracks:
- Artifact Submission: Friday, August 27, 2021
- Author Notification: Friday, September 17, 2021
Evaluation Criteria
| Badge | Badge name | Criteria |
|---|---|---|
| **Available** (ICSME’21, SCAM’21, and VISSOFT’21 tracks) | Open Research Objects (ORO) | Placed on a publicly accessible archival repository. A DOI or link to this persistent repository, along with a unique identifier for the object, is provided. Artifacts have not been formally evaluated. |
| **Functional** (Artifacts Evaluated) | No badge | Artifacts documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation. |
| **Reusable** (Artifacts Evaluated; ICSME’21, SCAM’21, and VISSOFT’21 tracks) | Research Objects Reviewed (ROR) | Functional + very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to. |
| **Reproduced** (Results Validated\*; ICSME’21, SCAM’21, and VISSOFT’21 tracks) | Results Reproduced (ROR-R) | The artifacts provided by the original authors are Functional + the main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the original authors. |
| **Replicated** (Results Validated\*; ICSME’21, SCAM’21, and VISSOFT’21 tracks) | Results Replicated (RER) | The main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts. |

\* Results Validated badges are awarded to the study being replicated or reproduced.
The ICSME artifact evaluation track uses a single-blind review process. Artifacts will be evaluated using 1) the criteria summarized in the Criteria column of the table above, 2) the quality of the documentation produced by the authors, as described in Documenting the Artifact below, and 3) the process described in Reviewing the Artifact below.
The goal of this track is to encourage reusable research products. Hence, no Functional badges will be awarded.
Best Artifact Award
There will be a Best Artifact Award for each venue (ICSME, VISSOFT, SCAM) to recognize the effort of authors creating and sharing outstanding research artifacts.
Submission and Review
Note that all submissions, reviewing, and notifications for this track will be via the ICSME 2021 EasyChair conference management system (“Artifact Evaluation” Track).
- There will be no emails outside EasyChair notifying authors, acknowledging submissions, or informing authors of the results of the review process.
- During the first week after submission, the kick-the-tires period, the authors might be asked to provide clarifications if the reviewers have problems with the artifacts.
- After the kick-the-tires period, the reviewing process can start.
- Authors will be notified of final decisions on the notification date.
Authors of the papers accepted to the tracks must perform the following steps to submit an artifact for the Available (ORO) and Reusable (ROR) badges:
- Prepare the artifact
- Make the artifact available
- Document the artifact
- Declare conflict of interest
- Submit the artifact
See Documenting the Artifact below for more details on how to submit for Reproduced (ROR-R) and Replicated (RER).
Preparing the Artifact
There are two options depending on the nature of the artifacts: Installation Package or Simple Package. In both cases, the configuration and installation for the artifact should take less than 30 minutes. Otherwise the artifact is unlikely to be endorsed simply because the committee will not have sufficient time to evaluate it.
- _Installation Packages_: If the artifact consists of a tool or software system, then the authors need to prepare an installation package so that the tool can be installed and run in the evaluator’s environment. Provide enough associated instruction, code, and data such that a computer scientist with reasonable knowledge of scripting, build tools, etc. could install, build, and run the code. If the artifact contains or requires the use of a special tool or any other non-trivial piece of software, the authors must provide a VirtualBox VM image or a Docker container image with a working environment containing the artifact and all the necessary tools. Similarly, if the artifact requires specific hardware, this should be clearly documented in the requirements (see Documenting the Artifact below). Note that we expect that the artifacts will have been vetted on a clean machine before submission. (A sketch of Docker-based installation instructions follows this list.)
- Simple Package: If the artifact contains only documents that can be used with a simple text editor, a PDF viewer, or some other common tool (e.g., a spreadsheet program in its basic configuration) the authors can just save all documents in a single package file (zip or tar.gz).
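As a concrete illustration, here is a minimal sketch of what the installation instructions for a Docker-based installation package could look like. The image name, file names, and commands are hypothetical placeholders, not requirements of the track:

```markdown
# INSTALL

## Requirements
See REQUIREMENTS.md (e.g., Docker 20.x, 8 GB RAM).

## Installation (under 30 minutes)
1. Load the provided container image (`icsme21-artifact.tar` is a
   placeholder name):

       docker load -i icsme21-artifact.tar

2. Start a container with an interactive shell:

       docker run -it icsme21-artifact bash

## Testing the installation
Inside the container, run the smoke test; it should finish within a
minute and print "OK" together with a small sample of the results:

       ./run_smoke_test.sh
```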
Making the Artifact Available
Authors need to make the packaged artifact (installation package or simple package) available so that the Evaluation Committee can access it. We suggest a link to a public repository or to a single archive file in a widely available archive format. If the authors are aiming for the Available badge, the artifact needs to be publicly accessible. Note that links to personal websites or temporary drives (e.g., Google Drive) are non-persistent, and thus artifacts placed in such locations will not be considered for the Available badge. Examples of persistent repositories that offer DOIs are Zenodo, figshare, and Open Science Framework. Other suitable providers can be found here. For larger files like VirtualBox images, we recommend the use of such open repositories. Institutional repositories are acceptable. In all cases, repositories used to archive data should have a declared plan to enable permanent accessibility.
If the authors are not aiming for the Available badge, the artifacts do not have to be publicly accessible for the review process. In this case, the authors are asked to provide a private link or a password-protected link.
Documenting the Artifact
Authors need to write and submit documentation explaining, in sufficient detail, how to obtain the artifact package, how to unpack it, how to get started, and how to use the artifact. The artifact submission must describe only the technical aspects and uses of the artifact that are not already described in the paper. The submission should contain the following documents (in markdown plain text format within the submission root folder):
- A README.md main file describing what the artifact does and how and where it can be obtained (with hidden links and access passwords if necessary). It should also give a clear, step-by-step description of how to reproduce the results presented in the paper. (A minimal skeleton is sketched after this list.)
- A LICENSE.md file describing the distribution rights. Note that to qualify for the Available badge, the license must be some form of open source license.
- A REQUIREMENTS.md file describing all necessary software/hardware prerequisites.
- An INSTALL.md file with installation instructions. These instructions should include notes illustrating a very basic usage example or a method to test the installation. This could be, for instance, the expected output that confirms the code is installed and working, and that it is doing something interesting and useful.
- A copy of the accepted paper in pdf format.
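To make the expected structure more tangible, the following is a minimal README.md skeleton; the section headings and placeholders are illustrative suggestions, not a format mandated by the track:

```markdown
# Artifact for "<paper title>"

Short description of what the artifact does and which results in the
paper it supports.

## Obtaining the artifact
Archived at <DOI or persistent link> (access password: <...>, if the
artifact is access-protected during review).

## Reproducing the results
1. Install the artifact as described in INSTALL.md.
2. Run <script> to regenerate the data behind Table 1; the output is
   written to <output folder>.
3. Run <script> to regenerate Figures 3 and 4.
```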
For Reproduced (ROR-R) submissions:
- The original authors must have made artifacts available for use in the reproduction. The artifact must meet the criteria for the Available (ORO) badge. You do not need to document the artifact.
- Submissions for the Reproduced (ROR-R) badge must fall within the scope of topics covered by ICSME, SCAM, and VISSOFT.
- If the submission is accepted, we will attempt to recommend that the original paper receives a Reproduced (ROR-R) badge, and if not already applied, an Available (ORO) badge.
- You must submit a README.md file in markdown plain text format, containing the following details:
- A link to the reproduction study (e.g., a DOI link to a publisher site or to a pre-print).
- A link to the original study.
- A link to the artifacts used in the reproduction.
- Explanation of whether the study is a partial or complete reproduction.
- Explanation of what was reproduced, and the results of the reproduction.
For Replicated (RER) submissions:
- Submissions for the Replicated (RER) badge must fall within the scope of topics covered by ICSME, SCAM, and VISSOFT.
- If the submission is accepted, we will attempt to recommend that the original paper receives a Replicated (RER) badge.
- If you have made new artifacts available as part of your replication, you may apply for the Available (ORO) and Reusable (ROR) badges only if your replication is an accepted submission to ICSME, SCAM, or VISSOFT 2021.
- You must submit a README.md file in markdown plain text format, containing the following details (a minimal sketch follows this list):
- A link to the replication study (e.g., a DOI link to a publisher site or to a pre-print).
- A link to the original study.
- Explanation of whether the study is a partial or complete replication.
- Explanation of what was replicated, and the results of the replication.
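A minimal README.md for a Replicated (RER) submission could look as follows; all links and descriptions are placeholders. A Reproduced (ROR-R) submission would additionally include a link to the artifacts used in the reproduction:

```markdown
# Replication of "<original paper title>"

- Replication study: <DOI or pre-print link>
- Original study: <DOI link>
- Scope: partial replication, covering RQ1 and RQ2 of the original study
- What was replicated and results: <summary of the replicated analyses
  and how the outcomes compare with the original findings>
```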
Reviewing the Artifact
The review process will be interactive as follows:
- Kick-the-tires: Before the actual evaluation, reviewers will check the integrity of the artifact and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, VM won’t start, immediate crashes on the simplest example, etc.). The Evaluation Committee may contact the authors to request clarifications on the basic installation and start-up procedures or to resolve simple installation problems. Authors are required to provide responses promptly (within 48h) to help reviewers resolve any technical issues with their artifact submission. This phase will be 1 week long (this is a hard limit) and the authors will be given 2 chances to clarify/fix the problems within this period. If the problems remain after the interactions, the submission will be rejected. The number of interactions is limited to maintain the workload for the reviewers at a reasonable level.
- Artifact assessment: Reviewers evaluate the artifacts and provide comments via EasyChair.
- Notification: Authors are informed of the outcome.
Additional information for AUTHORS:
- Make sure that a link to the artifact is included in the paper submission and that it remains there in the camera-ready (CR) version. The process for awarding badges is conducted after the CR deadline.
- Download the artifact that you have submitted onto a clean installation and follow your own instructions to limit the potential problems that the reviewers might encounter. Include at the end of the INSTALL.md the configuration for which the installation was tested.
- For software artifacts, consider preparing a virtual image in addition to your self-contained artifact: this greatly reduces possible problems on the reviewers’ side, and will benefit future users of the artifact. Some options include: Docker, VirtualBox, Vagrant, Packer. Non-software artifacts (e.g. datasets) should be distributed as a single archive (no need for a VM).
- There will be a maximum of 2 interactions with the reviewer during the kick-the-tires phase. Please make sure that your answers are clear and provide as much detail as possible to resolve the problems that the reviewers are having.
- Reviewers should not need to figure out on their own what the input is for a specific step or what output is produced (and where). All usage instructions should be explicitly documented in the step-by-step instructions of the README.md file.
- Provide an explicit mapping between the results and claims reported in the paper and the steps listed in the README.md, for easy traceability (see the sketch after this list).
- Place any additional information that you think might be useful, but that does not fit the required documents, in a separate document (ADDITIONAL_INFORMATION.md).
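One lightweight way to provide the claims-to-steps mapping mentioned above is a table in the README.md; the entries below are purely illustrative:

```markdown
| Claim / result in the paper  | README.md step | Expected output     |
|------------------------------|----------------|---------------------|
| Table 2 (detection accuracy) | Step 2         | results/table2.csv  |
| Figure 4 (runtime trend)     | Step 3         | figures/figure4.pdf |
```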
Additional information for REVIEWERS:
- We adopt the following definitions:
- Documented: At minimum, an inventory of artifacts is included, and sufficient description provided to enable the artifacts to be exercised.
- Consistent: The artifacts are relevant to the associated paper, and contribute in some inherent way to the generation of its main results.
- Complete: To the extent possible, all components relevant to the paper in question are included. (Proprietary artifacts need not be included. If they are required to exercise the package then this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)
- Exercisable: Included scripts and/or software used to generate the results in the associated paper can be successfully executed, and included data can be accessed and appropriately manipulated.
- Your goal is to gain sufficient confidence that the results of the paper can be obtained by relying on the submitted artifact. You are free to employ any strategy that allows you to do so, such as inspecting the code, rebuilding it from scratch, or changing the code/data. However, none of these is mandatory.
- We do not expect an exhaustive inspection of the original paper results. However, we do recommend a partial inspection or, at minimum, a spot-check of the produced data sufficient to gain confidence in use of the artifact to produce the published research results.
- Some artifacts are difficult to run and might require additional hardware/software resources. Those will be evaluated on a case-by-case basis as typical strategies for evaluation might not be applicable.
- For the Available badge, artifacts do not need to have been formally evaluated in order for an article to receive it. In addition, they need not be complete in the sense described above. They simply need to be relevant to the study and add value beyond the text in the article. Such artifacts could be something as simple as the data from which the figures are drawn, or as complex as a complete software system under study.
- For the Reusable badge, the artifacts associated with the paper must be of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.
- For the Replicated badge, there must be enough evidence that the results of the paper have been produced independently, i.e., without the use of author-supplied artifacts.
- For the Reproduced badge, it must be clear what was supplied by the authors of the original work, and there must be a link to that material.
- For replicated and reproducible badges, exact replication or reproduction of results is not required, or even expected. Instead, the results must be in agreement to within a tolerance deemed acceptable for experiments of the given type. In particular, differences in the results should not change the main claims made in the paper.