latchbio-integration
Build bioinformatics pipelines with Latch SDK
Also available from: davila7
Deploy production-ready bioinformatics workflows without managing infrastructure. Create serverless pipelines using Python decorators with automatic containerization, GPU support, and integrated cloud storage.
1. Download the skill ZIP
2. Upload it in Claude: go to Settings → Capabilities → Skills → Upload skill
3. Toggle the skill on and start using it
Test it
Using "latchbio-integration". Create a Latch workflow for protein structure prediction
Expected outcome:
- Use @large_gpu_task decorator with nvidia-tesla-v100 GPU
- Import alphafold from latch.verified module
- Configure input via LatchFile for FASTA sequence
- Set output directory with LatchDir for PDB results
- Platform automatically handles Docker containerization
- Monitor execution via Latch dashboard
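The outcomes above can be sketched as a minimal task-plus-workflow skeleton. This is an illustrative assumption, not Latch's reference implementation: the model invocation is a placeholder, and the `try`/`except` stubs let the skeleton be read without the SDK installed.

```python
# Hedged sketch of a GPU structure-prediction task on Latch.
# Assumes the Latch SDK is installed; the except branch substitutes
# no-op stubs so the skeleton is inspectable without the platform.
try:
    from latch import large_gpu_task, workflow
    from latch.types import LatchFile, LatchDir
except ImportError:
    def large_gpu_task(fn):          # stub: pass the function through unchanged
        return fn
    def workflow(fn):
        return fn
    LatchFile, LatchDir = str, str   # stand-ins for Latch's remote path types

@large_gpu_task
def predict_structure(fasta: LatchFile, out_dir: LatchDir) -> LatchDir:
    """Run a structure-prediction model (e.g. AlphaFold) on one FASTA file.

    The platform containerizes this function and schedules it on a GPU node.
    The actual model invocation is elided here; a real task would write PDB
    files into out_dir before returning it.
    """
    return out_dir

@workflow
def structure_wf(fasta: LatchFile, out_dir: LatchDir) -> LatchDir:
    """Single-step workflow: FASTA in, directory of PDB results out."""
    return predict_structure(fasta=fasta, out_dir=out_dir)
```

Once registered, runs of this workflow appear in the Latch dashboard for monitoring.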
Using "latchbio-integration". How do I set up a DESeq2 differential expression analysis?
Expected outcome:
- Import deseq2 from latch.verified module
- Define input parameters for count matrix and sample metadata
- Configure output directory for results and plots
- Platform provisions appropriate compute resources
- Access results through registered output paths
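A minimal sketch of the DESeq2 wiring follows. The `latch.verified` import path and the `deseq2` call signature are assumptions taken from the outcomes list above; check the SDK documentation for the real entry point before using this.

```python
# Hedged sketch: wiring a verified DESeq2 step into a Latch workflow.
# The deseq2 entry point name and parameters are assumptions; stubs below
# stand in when the Latch SDK is not installed.
try:
    from latch import workflow
    from latch.types import LatchFile, LatchDir
    from latch.verified import deseq2  # assumed entry-point name
except ImportError:
    def workflow(fn):
        return fn
    LatchFile, LatchDir = str, str
    def deseq2(counts, metadata, output_dir):  # stub for the verified step
        return output_dir

@workflow
def dge_wf(counts: LatchFile, metadata: LatchFile, output_dir: LatchDir) -> LatchDir:
    """Differential expression: count matrix and sample metadata in, results out."""
    return deseq2(counts=counts, metadata=metadata, output_dir=output_dir)
```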
Using "latchbio-integration". Configure GPU resources for AlphaFold on Latch
Expected outcome:
- Use @large_gpu_task decorator for GPU workloads
- Set gpu_type to nvidia-tesla-v100 or nvidia-tesla-a100
- Configure memory requirements based on protein size
- Platform handles GPU scheduling automatically
- Monitor GPU utilization during execution
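In code, GPU sizing is typically expressed by choosing a decorator preset. Whether a per-task `gpu_type` knob (e.g. `nvidia-tesla-a100`) is also exposed depends on the SDK version, so the comments below are assumptions to verify against current Latch docs.

```python
# Hedged sketch: selecting GPU resources by decorator preset.
# Stubs stand in when the Latch SDK is not installed.
try:
    from latch import small_gpu_task, large_gpu_task
    from latch.types import LatchFile, LatchDir
except ImportError:
    def small_gpu_task(fn): return fn
    def large_gpu_task(fn): return fn
    LatchFile, LatchDir = str, str

@small_gpu_task        # smaller/cheaper GPU preset; fine for short proteins
def fold_small(fasta: LatchFile, out_dir: LatchDir) -> LatchDir:
    return out_dir     # placeholder body

@large_gpu_task        # larger preset (more GPU memory) for long sequences
def fold_large(fasta: LatchFile, out_dir: LatchDir) -> LatchDir:
    return out_dir     # placeholder body
```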
Security Audit
Safe: Documentation-only skill containing only Markdown files (.md) and JSON metadata. No executable source code present. All 285 static findings are false positives caused by the scanner misinterpreting Markdown code-block syntax and documentation examples as executable code. The 'backtick execution' findings are Markdown code delimiters, 'weak cryptographic algorithm' findings are misinterpreted Python decorators, and 'credential access' findings demonstrate legitimate Latch SDK APIs. This is standard bioinformatics documentation teaching users how to use a cloud-based workflow platform.
Risk Factors
⚙️ External commands (6)
⚡ Contains scripts (2)
🌐 Network access (2)
🔑 Env variables (2)
What You Can Build
Deploy RNA-seq analysis pipelines
Build complete transcriptomics workflows from quality control through differential expression analysis.
Run protein structure prediction
Execute AlphaFold or ColabFold jobs with automatic GPU resource allocation and cloud storage.
Integrate verified bioinformatics tools
Combine pre-built workflows with custom steps for specialized analysis pipelines.
Try These Prompts
Create a Latch workflow that processes files using the @small_task decorator and returns a LatchFile result.
Configure a Latch task to use an A100 GPU for deep learning model execution with custom resource specifications.
Show how to register an existing Nextflow or Snakemake pipeline to the Latch platform.
Create a complete RNA-seq pipeline with quality control, alignment, and quantification tasks using Latch decorators.
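The last prompt above can be sketched as a multi-step workflow composed from `@small_task` steps (QC → alignment → quantification). Tool invocations are placeholders, and stubs stand in when the SDK is not installed; treat this as a shape to fill in, not a working pipeline.

```python
# Hedged sketch of an RNA-seq workflow composed from @small_task steps.
try:
    from latch import small_task, workflow
    from latch.types import LatchFile, LatchDir
except ImportError:
    def small_task(fn): return fn
    def workflow(fn): return fn
    LatchFile, LatchDir = str, str

@small_task
def run_qc(reads: LatchFile) -> LatchFile:
    """Quality control (e.g. fastp); a real step returns trimmed reads."""
    return reads  # placeholder

@small_task
def align(reads: LatchFile, index: LatchDir) -> LatchFile:
    """Alignment (e.g. STAR/HISAT2); a real step returns a BAM file."""
    return reads  # placeholder

@small_task
def quantify(bam: LatchFile) -> LatchFile:
    """Quantification (e.g. featureCounts); a real step returns a count matrix."""
    return bam  # placeholder

@workflow
def rnaseq_wf(reads: LatchFile, index: LatchDir) -> LatchFile:
    """Chain the three tasks; Latch infers the execution DAG from these calls."""
    return quantify(bam=align(reads=run_qc(reads=reads), index=index))
```

Each task runs in its own container, so heavyweight aligners and lightweight QC steps can use different resource presets.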
Best Practices
- Start with standard task decorators (@small_task, @large_task) and scale resources only when profiling shows the need
- Use type annotations and docstrings for all parameters to auto-generate workflow interfaces
- Test workflows locally with Docker before registering to the platform
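The second practice above, fully annotated parameters plus a docstring, is what lets the platform generate a workflow interface. A minimal sketch (the task body is a placeholder; stubs stand in without the SDK):

```python
# Hedged sketch: type annotations and a docstring on every parameter,
# which Latch uses to auto-generate the workflow's web interface.
try:
    from latch import small_task
    from latch.types import LatchFile
except ImportError:
    def small_task(fn): return fn
    LatchFile = str

@small_task
def trim_reads(reads: LatchFile, quality_cutoff: int = 20) -> LatchFile:
    """Trim low-quality bases from sequencing reads.

    Args:
        reads: Input FASTQ file.
        quality_cutoff: Phred score below which bases are trimmed.
    """
    return reads  # placeholder; a real task would call a trimmer and return a new file
```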
Avoid
- Avoid over-provisioning resources: GPU tasks cost significantly more than CPU-only tasks
- Do not use dynamic resource configuration at runtime - decorators must be static
- Avoid mixing multiple workflow frameworks in a single pipeline without clear separation