This course is in active development. Preview the scope below and create a free account to be notified the moment it goes live.
GBFA
The GBFA certification teaches digital forensic practitioners how to collect, image, and analyze evidence in the field, covering fundamentals, live triage, automation, and removable media handling under time constraints.
Who Should Take This
It is intended for incident responders, law‑enforcement officers, and forensic analysts who already understand operating‑system basics and command‑line tools, and who want to sharpen field‑acquisition skills, reduce evidence‑collection latency, and produce admissible, repeatable results during rapid, high‑stakes investigations.
What's Covered
1. Digital Evidence Fundamentals
2. Forensic Imaging and Disk Acquisition
3. Live System Triage
4. Field Triage Tools and Automation
5. Removable Media and External Device Forensics
6. Network-Based Evidence Collection
7. Acquisition Planning and Decision-Making
What's Included in AccelaStudy® AI
Course Outline
60 learning goals
1. Digital Evidence Fundamentals (2 topics)
Evidence principles and legal foundations
- Describe the principles of digital evidence including Locard's exchange principle applied to digital environments, order of volatility (RFC 3227), best evidence rule, and the Daubert/Frye standards for scientific evidence admissibility in court proceedings.
- Identify the categories of digital evidence including volatile data (memory, running processes, network connections), semi-volatile data (temporary files, swap space), and non-volatile data (disk storage, removable media) and their collection priority.
- Describe chain-of-custody documentation requirements including evidence intake forms, transfer logs, storage conditions, access records, and hash verification at each custody transfer point to maintain admissibility in legal proceedings.
- Implement chain-of-custody tracking by creating evidence logs, documenting environmental conditions, recording photographic evidence of scene and media, and generating tamper-evident seals with cryptographic hash verification for each acquired item.
- Evaluate whether digital evidence collection procedures will withstand legal scrutiny by assessing documentation completeness, acquisition method defensibility, hash verification integrity, and adherence to organizational and jurisdictional requirements.
Cryptographic hashing and integrity verification
- Describe cryptographic hash functions (MD5, SHA-1, SHA-256) including their properties (deterministic, fixed-length output, avalanche effect), collision resistance characteristics, and role in forensic evidence integrity verification.
- Implement hash-based evidence verification using md5sum, sha256sum, and forensic tool built-in hashing to compute and compare hash values before and after acquisition, during transfer, and at analysis to demonstrate evidence has not been altered.
- Evaluate hash verification results including interpreting hash mismatches, determining whether differences indicate evidence tampering versus expected changes (bad sectors, imaging errors), and selecting appropriate hash algorithms for different evidence scenarios.
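The pre/post verification workflow above can be sketched with standard coreutils. Every path and filename here is an illustrative stand-in, with a plain file playing the role of evidence media:

```shell
mkdir -p /tmp/gbfa_hash && cd /tmp/gbfa_hash
printf 'evidence bytes' > evidence.bin        # stand-in for source media

# Hash before acquisition, then again after the copy; matching
# digests demonstrate the evidence was not altered.
sha256sum evidence.bin | awk '{print $1}' > pre.hash
cp evidence.bin evidence_copy.bin             # stand-in for the acquisition step
sha256sum evidence_copy.bin | awk '{print $1}' > post.hash

if cmp -s pre.hash post.hash; then
  echo "VERIFIED: hashes match"
else
  echo "ALERT: hash mismatch" >&2
fi
```

On a real acquisition the same comparison is repeated at every custody transfer point, so any alteration is localized to a specific handoff.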
2. Forensic Imaging and Disk Acquisition (3 topics)
Write blocking technology
- Describe hardware and software write blocker mechanisms including Tableau/UltraBlock hardware blockers, Windows registry write protection, Linux mount read-only options, and the forensic necessity of preventing write operations to source evidence media.
- Implement hardware write blocker deployment by connecting source media through write-blocking bridges, verifying write protection is active through read/write testing, and documenting write blocker model, firmware, and validation results.
- Implement software-based write protection on Linux forensic workstations using mount options (ro, noatime), udev rules, and blockdev read-only settings to prevent accidental modification of attached evidence media during examination.
- Assess write blocker effectiveness by evaluating device compatibility, interface support (SATA, USB, NVMe, SAS), maximum throughput limitations, and failure scenarios that could compromise write protection during forensic acquisition.
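As a sketch of the software write-protection described above, the udev rule below would mark removable disks read-only at attach time. It assumes a Linux workstation; `/dev/sdb` is a placeholder device name, and the rule is staged to /tmp here because installing it and toggling real devices requires root and attached hardware:

```shell
# Stage the rule (it would normally be installed under /etc/udev/rules.d/).
RULES=/tmp/99-forensic-readonly.rules
cat > "$RULES" <<'EOF'
# Mark any removable block device read-only the moment it is attached.
SUBSYSTEM=="block", ATTR{removable}=="1", RUN+="/sbin/blockdev --setro /dev/%k"
EOF

# On a real workstation (root required) the per-device steps would be:
#   blockdev --setro /dev/sdb       # set the device read-only
#   blockdev --getro /dev/sdb       # prints 1 once protection is active
#   mount -o ro,noatime /dev/sdb1 /mnt/evidence
echo "staged write-protect rule at $RULES"
```

Note that software protection like this should still be validated with a read/write test before acquisition, exactly as the hardware-blocker objective above requires.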
Forensic disk imaging tools and formats
- Describe forensic image formats including raw (dd), E01 (EnCase Evidence File), AFF4 (Advanced Forensic Format), and their characteristics regarding compression, metadata storage, segmentation, and integrity verification support.
- Implement forensic disk imaging using dd and dc3dd with appropriate block sizes, error handling (conv=noerror,sync), hash computation during acquisition, and progress monitoring for creating verified raw forensic images of hard drives and SSDs.
- Implement forensic imaging using FTK Imager to create E01 format images with case metadata, examiner information, evidence descriptions, and built-in hash verification for Windows-based acquisition workflows.
- Implement forensic imaging using Guymager on Linux forensic workstations with parallel hashing (MD5+SHA-256), compression settings, and split file output for creating verified E01 images with optimized throughput and integrity assurance.
- Compare forensic imaging approaches to select the optimal tool and format based on media type, available time, storage constraints, required compression, and downstream analysis tool compatibility for each acquisition scenario.
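The raw-imaging workflow above can be demonstrated with dd against a file-backed stand-in for the source device (on a live job the input would be /dev/sdX behind a write blocker; all paths here are illustrative):

```shell
mkdir -p /tmp/gbfa_img && cd /tmp/gbfa_img
# A 1 MiB file stands in for the source device.
dd if=/dev/zero of=source.dd bs=1024 count=1024 2>/dev/null
printf 'artifact' | dd of=source.dd bs=1 seek=4096 conv=notrunc 2>/dev/null

# Raw acquisition: conv=noerror,sync keeps going past read errors,
# padding unreadable blocks so offsets in the image stay aligned.
dd if=source.dd of=image.raw bs=64K conv=noerror,sync status=progress

# Verify that source and image hash identically.
sha256sum source.dd image.raw
```

dc3dd wraps the same workflow with on-the-fly hashing (its `hash=` option), which avoids a second read pass over slow media.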
Storage technology considerations
- Describe SSD-specific forensic challenges including TRIM commands, wear leveling algorithms, garbage collection processes, and over-provisioned space that affect the availability and recoverability of deleted data on solid-state storage devices.
- Identify RAID configurations (RAID 0, 1, 5, 6, 10) and describe how each topology affects forensic acquisition strategy including stripe reconstruction, parity calculation, and the requirement to image individual member disks versus the logical volume.
- Implement encrypted disk acquisition by identifying BitLocker, FileVault, VeraCrypt, and LUKS encrypted volumes, extracting recovery keys when available, and performing decrypted imaging versus encrypted raw image acquisition based on key availability.
- Evaluate storage technology impact on evidence recovery by assessing TRIM state, SSD controller behavior, encryption key availability, and RAID topology to determine expected data recoverability before committing to time-intensive acquisition procedures.
3. Live System Triage (3 topics)
Live response methodology
- Describe the decision framework for live system triage versus powered-off imaging including when volatile data collection justifies the forensic footprint of running tools on a live system and how to minimize evidence contamination.
- Implement live system volatile data collection using trusted tools from removable media to capture running processes, network connections, logged-in users, scheduled tasks, and system configuration without relying on potentially compromised system binaries.
- Implement memory acquisition from live Windows systems using WinPmem, DumpIt, or Magnet RAM Capture to create complete physical memory dumps before system shutdown for subsequent analysis using Volatility or Rekall frameworks.
- Assess live triage results to make rapid containment decisions including whether to isolate the system from the network, preserve power state for continued volatile collection, or proceed to full disk acquisition based on observed indicators.
Windows live triage
- Identify key Windows volatile data sources including process list (tasklist), network connections (netstat), open files (openfiles), logged-in users (query user), ARP cache, DNS cache, and routing table relevant to incident triage.
- Implement Windows live triage scripting using PowerShell to collect process details (Get-Process), network connections (Get-NetTCPConnection), services, scheduled tasks, autorun entries, and recent event log entries into structured output for rapid analysis.
- Implement Windows event log collection from live systems by extracting Security, System, Application, PowerShell, and Sysmon event logs using wevtutil or PowerShell (Get-WinEvent) for preservation before potential log rotation or adversary clearing.
- Analyze Windows live triage data to identify suspicious processes, unauthorized network connections, persistence mechanisms, and indicators of compromise to prioritize investigation actions and guide subsequent forensic acquisition decisions.
Linux and macOS live triage
- Identify key Linux volatile data sources including /proc filesystem, ps output, ss/netstat connections, lsof open files, crontab entries, systemd services, loaded kernel modules, and mount points relevant to incident triage on Linux servers.
- Implement Linux live response collection using shell scripts to gather process listings, network connections, user sessions, cron jobs, bash history, /etc/passwd modifications, and log files from /var/log for rapid triage of compromised systems.
- Implement macOS live triage to collect running processes, network connections, LaunchAgents/LaunchDaemons, kernel extensions, Unified Log entries, and Gatekeeper quarantine records for initial compromise assessment.
- Compare live triage approaches across Windows, Linux, and macOS to assess platform-specific volatile data availability, evidence collection tool compatibility, and operating system differences that affect field triage strategy selection.
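The Linux live-response collection above can be sketched as a minimal single-execution script. The output location and filenames are illustrative choices, and a production script would run trusted binaries from removable media rather than the system path:

```shell
# Minimal Linux live-response sketch: volatile data into one directory.
OUT=/tmp/gbfa_triage
mkdir -p "$OUT"

ps aux              > "$OUT/processes.txt"        # running processes
cat /proc/mounts    > "$OUT/mounts.txt"           # mounted filesystems
who                 > "$OUT/sessions.txt" || true # logged-in users
(ss -tunap || netstat -tunap) 2>/dev/null > "$OUT/network.txt" || true

# Hash the collection immediately so later integrity checks have
# a baseline from the moment of capture.
sha256sum "$OUT"/*.txt > "$OUT/manifest.sha256"
echo "collected into $OUT"
```

A real deployment would add cron entries, bash history, and /var/log copies per the objectives above, but the pattern (collect, then hash at once) stays the same.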
4. Field Triage Tools and Automation (3 topics)
KAPE artifact collection and processing
- Describe KAPE architecture including Targets (artifact collection definitions), Modules (processing/parsing definitions), the collection engine workflow, and output directory structure used for automated forensic triage and artifact extraction.
- Implement KAPE-based artifact collection by selecting appropriate Target configurations to gather registry hives, event logs, prefetch files, browser data, $MFT, and user profile artifacts from Windows systems for rapid forensic triage.
- Implement KAPE Module execution to process collected artifacts using EZTools (MFTECmd, PECmd, RECmd, EvtxECmd, AppCompatCacheParser) and generate parsed output ready for timeline analysis and artifact correlation.
- Configure custom KAPE Targets and Modules for organization-specific triage needs including custom application artifact collection, specialized log source gathering, and automated processing workflows tailored to the incident response playbook.
- Evaluate KAPE triage output quality by assessing artifact coverage completeness, identifying collection gaps for specific investigation hypotheses, and recommending additional targeted collection to fill evidentiary gaps discovered during initial analysis.
Velociraptor for field triage
- Describe Velociraptor architecture including the server-client model, VQL (Velociraptor Query Language) for artifact collection, built-in artifact library, and client deployment methods for rapid endpoint triage across enterprise environments.
- Implement Velociraptor single-server deployment for field triage including client binary generation, agent deployment to target endpoints, and execution of built-in triage artifacts (Windows.Triage, Windows.EventLogs, Generic.System.Pstree) for rapid evidence collection.
- Implement Velociraptor hunts to collect specific artifacts across multiple endpoints simultaneously, monitor collection progress, and export collected data for offline analysis using VQL queries tailored to investigation requirements.
- Compare KAPE and Velociraptor triage approaches to determine optimal tool selection based on network accessibility, number of endpoints, required artifacts, deployment constraints, and analysis workflow integration requirements.
Additional triage and collection tools
- Implement CyLR (CyberLive Response) collection for rapid live response artifact gathering across Windows, Linux, and macOS using pre-configured collection profiles that target high-value forensic artifacts without requiring full disk imaging.
- Implement forensic bootable media (SANS SIFT, Paladin, CAINE) deployment for field acquisition scenarios where live booting from trusted media is preferred over running tools on the target operating system.
- Implement automated triage scripting by creating reusable collection scripts that combine volatile data capture, targeted artifact extraction, and hash verification into single-execution workflows for consistent, repeatable field triage operations.
- Evaluate triage tool selection and deployment strategy by comparing collection speed, artifact coverage, platform compatibility, and evidence integrity assurance across CyLR, KAPE, Velociraptor, and forensic boot media for field operation readiness.
5. Removable Media and External Device Forensics (2 topics)
USB and removable media acquisition
- Describe USB device forensic considerations including FAT/exFAT/NTFS file system structures on removable media, USB device identification (VID/PID), connection tracking artifacts on host systems, and the order of volatility for USB evidence.
- Implement USB device history reconstruction from host system artifacts including Windows setupapi.dev.log, SYSTEM registry (USBSTOR, USB keys), mountpoints2 registry entries, and event logs to determine which USB devices connected and when.
- Implement removable media imaging for USB drives, SD cards, and external hard drives using write blockers and appropriate imaging tools with special consideration for flash media controller behavior and potential hidden partitions.
- Analyze USB device connection artifacts across multiple host systems to establish a timeline of device usage, identify data transfer patterns, and correlate device connections with user account activity and suspected data theft scenarios.
File recovery and carving from removable media
- Describe file carving principles including header/footer signature identification, file structure-based carving, and the limitations of carving fragmented files, overwritten data, and files from TRIM-enabled flash media.
- Implement file recovery from removable media using foremost, scalpel, and PhotoRec to carve deleted files from unallocated space, and use file system metadata analysis to recover files from deleted directory entries on FAT and NTFS volumes.
- Evaluate file recovery results by assessing carving completeness, identifying fragmented or partial file recoveries, validating recovered file integrity through hash comparison and content verification, and documenting recovery limitations.
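The header-signature technique above can be demonstrated end to end with grep and dd on a synthetic blob. All offsets and contents here are fabricated for the demo; real carvers such as scalpel and PhotoRec also match footers and handle fragmentation:

```shell
mkdir -p /tmp/gbfa_carve && cd /tmp/gbfa_carve
# Synthetic blob: zero padding, then the 8-byte PNG signature
# (\x89 P N G \r \n \x1a \n) at byte offset 4096, then a fake payload.
head -c 4096 /dev/zero           > blob.bin
printf '\211PNG\r\n\032\n'      >> blob.bin   # \211 octal = 0x89
printf 'fake-image-payload'     >> blob.bin
head -c 1024 /dev/zero          >> blob.bin

# grep -abo reports the byte offset of each match; the ASCII 'PNG'
# sits one byte into the signature, so subtract 1 for the header start.
OFFSET=$(grep -abo 'PNG' blob.bin | head -1 | cut -d: -f1)
HDR=$((OFFSET - 1))
echo "header at offset $HDR"                  # 4096 in this synthetic layout

# Carve a fixed window from the header onward; a real carver would
# locate the footer or parse the length field instead.
dd if=blob.bin of=carved.png bs=1 skip="$HDR" count=64 2>/dev/null
```

The fixed-window carve also illustrates the limitation named above: without footer matching, fragmented or truncated files come back partial.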
6. Network-Based Evidence Collection (1 topic)
Remote acquisition and network evidence
- Describe remote forensic acquisition methods including network-based imaging (F-Response, ewfacquire over iSCSI), remote agent-based collection, and cloud storage acquisition that enable evidence gathering without physical media access.
- Implement remote triage collection using Velociraptor, WinRM, or SSH-based scripts to collect forensic artifacts from systems that cannot be physically accessed, ensuring secure transfer and integrity verification of collected evidence.
- Assess remote acquisition limitations including network bandwidth constraints, evidence integrity concerns over untrusted networks, partial collection risks, and the tradeoff between remote triage speed and full-image forensic completeness.
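The secure-transfer-with-verification pattern above can be sketched by hashing the collection stream in transit. A local tar stands in for the `ssh user@host 'tar ...'` transport so the pattern is demonstrable end to end; all paths are hypothetical:

```shell
mkdir -p /tmp/gbfa_remote/src && cd /tmp/gbfa_remote
echo 'remote artifact' > src/auth.log   # stand-in for a remote log file

# On a live job this would be:
#   ssh user@host 'tar -C /var/log -cf - .' > collected.tar
# Here tar runs locally; tee splits the stream so the bytes written
# to disk and the bytes hashed are guaranteed to be the same stream.
tar -C src -cf - . | tee collected.tar | sha256sum | awk '{print $1}' > transit.hash

# Verify the archive on disk matches what crossed the wire.
sha256sum collected.tar | awk '{print $1}' > disk.hash
cmp -s transit.hash disk.hash && echo "transfer integrity verified"
```

Over an untrusted network the in-transit hash would also be computed independently on the remote end and compared, closing the gap the objectives above call out.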
7. Acquisition Planning and Decision-Making (2 topics)
Field acquisition strategy
- Implement a field acquisition decision matrix that guides the selection of imaging versus triage, live versus dead acquisition, and tool selection based on available time, media type, encryption state, and investigation objectives.
- Evaluate field acquisition completeness across all collected systems and media to identify evidence gaps, assess whether acquired evidence supports investigation hypotheses, and recommend follow-up acquisitions for critical missing data sources.
Documentation and reporting for field operations
- Implement field acquisition documentation by recording scene observations, evidence item descriptions, acquisition tool output, hash values, timestamps, and anomalies encountered during collection in a structured evidence intake report.
- Assess documentation quality by reviewing evidence logs for completeness, verifying all hash values are recorded, confirming chain-of-custody continuity, and identifying documentation gaps that could compromise evidence admissibility.
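The structured intake record above can be sketched as one appended entry per evidence item. The field names and layout are an illustrative choice, not a mandated format, and the examiner name and item are placeholders:

```shell
mkdir -p /tmp/gbfa_docs && cd /tmp/gbfa_docs
printf 'acquired drive image' > item001.raw   # stand-in evidence item

# Append one record per item: what it is, who took it, when, and the
# hash that anchors chain-of-custody verification at each transfer.
{
  echo "item_id:     ITEM-001"
  echo "description: stand-in evidence file (demo)"
  echo "examiner:    J. Doe (placeholder)"
  echo "acquired:    $(date -u +%Y-%m-%dT%H:%M:%SZ)"
  echo "sha256:      $(sha256sum item001.raw | awk '{print $1}')"
  echo "---"
} >> evidence_intake.log

cat evidence_intake.log
```

Because the hash is recorded at intake, the documentation-quality review described above reduces to recomputing each item's digest and confirming it appears in the log.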
Scope
Included Topics
- All domains covered by the GIAC Battlefield Forensics and Acquisition (GBFA) certification aligned with SANS FOR498: Battlefield Forensics and Data Acquisition.
- Digital evidence acquisition principles including order of volatility, forensic soundness, write protection, and evidence integrity verification through cryptographic hashing (MD5, SHA-1, SHA-256).
- Forensic imaging tools and techniques using dd, dc3dd, FTK Imager, Guymager, and ewfacquire for creating bit-for-bit disk images in raw (dd), E01 (EnCase), and AFF4 formats with integrity verification.
- Write blocker technology including hardware write blockers (Tableau/UltraBlock), software write blockers (Windows registry-based, Linux mount options), and validation procedures to ensure forensic acquisition does not alter source media.
- Live system triage and volatile data collection using KAPE (Kroll Artifact Parser and Extractor), Velociraptor, and custom PowerShell/Bash scripts for rapid evidence gathering from running systems during active incidents.
- Removable media forensics including USB drive analysis, SD card recovery, external hard drive imaging, optical media preservation, and SSD-specific considerations (TRIM, wear leveling, garbage collection) that affect evidence recovery.
- Chain of custody documentation, evidence handling procedures, legal considerations for digital evidence admissibility, and field triage best practices for time-constrained forensic acquisition scenarios.
- Field triage tools and automation including KAPE target and module configurations, Velociraptor client deployment for rapid triage, and CyLR for scalable live response collection across multiple endpoints.
Not Covered
- Advanced malware analysis, reverse engineering, and memory forensics at the depth covered by GCFA or GREM certifications.
- Network forensics, packet capture analysis, and traffic pattern investigation covered by GNFA certification.
- Mobile device forensics and smartphone acquisition techniques covered by GASF certification.
- Advanced Windows artifact interpretation beyond what is needed for initial field triage and acquisition decision-making.
- Courtroom testimony techniques and expert witness preparation beyond basic chain-of-custody and documentation requirements.
Official Exam Page
Learn more at GIAC Certifications
GBFA is coming soon
Adaptive learning that maps your knowledge and closes your gaps.
Create Free Account to Be Notified