We have a script developed by @Jiu9Shen that calculates a variety of statistics on a VCF file: https://gist.github.com/Jiu9Shen/1709484e7bf9564a27de6f2c221314b5. We currently paste the output of this script into the description of a given VCF file by hand, to give the researcher guidance on how they may want to filter it.
Since this information is so critical and is useful for any VCF file, we should add automatic calculation of these statistics, and generation of the table, to this module. That would streamline the process, ensure the statistics are available for every file, and provide this functionality to other Tripal sites using this module.
The script for calculating these statistics can take some time depending on the size of the VCF file. As such, it would be best run in a Tripal Job submitted when the VCF file is added or updated through the administrative interface. The results could then be stored in a generic VCF file metadata table, as suggested for the ABH format in #2, and the table on the VCF Filter form would be built by querying that table.
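The storage side could be as simple as a key/value table keyed by file ID. The real module would define this through Drupal's schema API and populate it from a Tripal Job; the sqlite sketch below only illustrates the shape, and the table and column names (`vcf_file_metadata`, `fid`, `stat_name`, `stat_value`) are assumptions, not the module's actual schema.

```python
import sqlite3

# Illustration only: a generic per-file metadata table. In the module this
# would live in the Drupal database, not sqlite.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE vcf_file_metadata (
        fid        INTEGER NOT NULL,   -- VCF file ID
        stat_name  TEXT NOT NULL,      -- e.g. 'sites', 'missing_rate'
        stat_value TEXT NOT NULL,
        PRIMARY KEY (fid, stat_name)
    )
""")

# The Tripal Job would compute the stats and upsert them for the file:
stats = {"sites": "1523412", "samples": "96", "missing_rate": "0.031"}
conn.executemany(
    "INSERT OR REPLACE INTO vcf_file_metadata VALUES (?, ?, ?)",
    [(1, name, value) for name, value in stats.items()],
)

# The VCF Filter form then builds its table from a simple lookup:
rows = conn.execute(
    "SELECT stat_name, stat_value FROM vcf_file_metadata WHERE fid = ?",
    (1,),
).fetchall()
```

Keeping the job (slow, per-file) separate from the form query (fast, cached in this table) is what lets the filter form stay responsive regardless of VCF size.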