Commit ddb68576 authored by Kenzo-Hugo Hillion's avatar Kenzo-Hugo Hillion

Merge branch '15-split-fasta-chunks' into 'master'

Split a FASTA file into chunks

Closes #15

See merge request metagenomics/snakemake!2
parents e3b2c07b ec6df182
rule split_fasta:
    """
    Split a FASTA file into chunks, each with the desired number of sequences
    """
    input:
        __split_fasta_input
    output:
        __split_fasta_output
    params:
        n_lines = __split_fasta_number_sequences * 2,
        prefix = __split_fasta_prefix
    shell:
        """
        cat {input} | awk '/^>/ {{if(N>0) printf("\\n"); printf("%s\\n",$0);++N;next;}} {{ printf("%s",$0);}} END {{printf("\\n");}}' | split -l {params.n_lines} -a 5 -d - {params.prefix}
        for i in `ls {params.prefix}*`; do mv $i ${{i}}.fa; done
        """
configfile: "config.yaml"


def count_sequences(fasta_file):
    """Count the sequences in a FASTA file (one per '>' header line)."""
    seq = 0
    with open(fasta_file, 'r') as file:
        for line in file:
            if line.startswith('>'):
                seq += 1
    return seq
# ==== Snakefile path ====
__split_fasta_rules = config.get("snakefiles", {}).get("split_fasta")
__main_output_dir = config.get('output_dir', 'output')

# ==== Split FASTA ====
__split_fasta_output_dir = __main_output_dir + "/split_fasta"
__split_fasta_input = config['input_fasta']
__split_fasta_number_sequences = config.get('split_fasta', {}).get('number_sequences', 1000000)
total_number_sequences = count_sequences(__split_fasta_input)
# Ceiling division, so an exact multiple does not yield an extra, empty chunk
n_chunks = -(-total_number_sequences // __split_fasta_number_sequences)
EXTENSIONS = [f"{i:05d}" for i in range(n_chunks)]
__split_fasta_prefix = "/".join([__split_fasta_output_dir, config['split_fasta']['prefix']])
__split_fasta_output = expand(__split_fasta_prefix + "{ext}.fa", ext=EXTENSIONS)

include: __split_fasta_rules

rule all:
    input: __split_fasta_output
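To illustrate how the chunk suffixes and expected output names are derived, here is the same arithmetic in plain Python (counts and paths are hypothetical; the list comprehension stands in for Snakemake's `expand()`):

```python
# Hypothetical counts: 2.5M sequences at 1M per chunk -> 3 chunks
total_number_sequences = 2_500_000
number_sequences = 1_000_000

# Ceiling division avoids an extra empty chunk when total is an exact multiple
n_chunks = -(-total_number_sequences // number_sequences)
extensions = [f"{i:05d}" for i in range(n_chunks)]

# Stand-in for expand(prefix + "{ext}.fa", ext=extensions)
outputs = [f"output/split_fasta/IGC_{ext}.fa" for ext in extensions]
print(outputs)
# ['output/split_fasta/IGC_00000.fa', 'output/split_fasta/IGC_00001.fa',
#  'output/split_fasta/IGC_00002.fa']
```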
snakefiles:
    split_fasta: /pasteur/projets/policy01/Atm/snakemake/tools/utils/split_fasta/Snakefile

input_fasta: /pasteur/projets/policy01/DBs/IGC/2014-9.9M/IGC.fa
output_dir: /pasteur/projets/policy01/sandbox/20200210_test_snakemake/output

split_fasta:
    prefix: IGC_
    number_sequences: 1000000
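To show how the Snakefile's `config.get(...)` chains resolve against this YAML, here is a plain-dict stand-in for the parsed config (values shortened and hypothetical):

```python
# Stand-in for the parsed config.yaml (paths shortened, values hypothetical)
config = {
    "snakefiles": {"split_fasta": "tools/utils/split_fasta/Snakefile"},
    "input_fasta": "IGC.fa",
    "output_dir": "output",
    "split_fasta": {"prefix": "IGC_", "number_sequences": 1000000},
}

# The same lookups the Snakefile performs, with defaults for missing keys
rules_path = config.get("snakefiles", {}).get("split_fasta")
out_dir = config.get("output_dir", "output")
n_seqs = config.get("split_fasta", {}).get("number_sequences", 1000000)
prefix = "/".join([out_dir + "/split_fasta", config["split_fasta"]["prefix"]])
print(prefix)  # output/split_fasta/IGC_
```

Using `dict.get` with defaults keeps the workflow usable when optional keys such as `output_dir` are omitted from the config.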