TryHackMe-Intro to Cold System Forensics
A look into the concepts of cold system forensics and how DFIR teams examine offline systems.
Task 1 Introduction
This room introduces users to cold system forensics, focusing on how DFIR teams examine offline systems. It will cover the differences, advantages, and disadvantages between live and cold system forensics and the challenges this type of forensics poses. It will also introduce data acquisition techniques and forensic tools and cover legal considerations for chain of custody and data volatility.
Objectives
- Learn the differences between live system forensics and cold system forensics.
- Explore the challenges and opportunities of examining offline systems.
- Learn the process and considerations of acquiring and preserving data from cold systems.
- Get a brief introduction to the various tools used in the field and when they are applicable.
Task 2 Challenges and Opportunities
Common Scenarios for Cold System Forensics
Cold system forensics is particularly applicable in various scenarios:
- Risk of modifying evidence: As live analysis can alter critical evidence, cold system forensics must ensure the evidence remains unaltered and admissible in court. This can be seen during forensic investigations where any interaction with the running system could overwrite volatile data or change file timestamps, compromising the integrity of the evidence.
- Comprehensive data capture: This applies when a thorough examination of all data is required. It is essential for deep investigations where every piece of information matters, such as financial fraud investigations. By shutting down the system and creating a bit-by-bit image of the storage device, analysts can capture all the data, including deleted files, and analyse the system’s complete state.
- Incident response: To preserve evidence from compromised systems without risk of alteration, shutting down a compromised server and performing cold analysis ensures the evidence remains intact. However, due to potential service disruptions, shutting down a compromised server in a large corporation running critical services might be avoided. Instead, forensic analysts might initially opt for live response techniques and later perform cold analysis on a cloned disk image to ensure no data is lost or modified.
- Legal proceedings: Cold system forensics ensures uncontaminated evidence that meets legal standards. Courts require a clear chain of custody and unaltered data, best achieved through cold analysis. When investigating intellectual property theft cases, demonstrating that the evidence was collected without any modification strengthens the case’s credibility and legal standing.
- Data recovery/File carving: This process retrieves deleted or lost files from a system. Cold analysis allows for reconstructing deleted files from disk images without overlooking any data during recovery.
- Legacy systems: Where live analysis is not feasible due to outdated or unstable systems, cold system forensics is handy. Live analysis tools may not be supported in environments with legacy systems, such as older industrial control systems in manufacturing plants. Therefore, cold analysis allows forensic experts to examine these systems without risking further instability or data loss.
- Cloud and virtualised environments: In today’s world, where cloud and virtualised environments are commonly used, cold system forensics can be used to analyse virtual machines without impacting running services. For example, a service provider might create snapshots of VMs suspected of being compromised, allowing analysts to analyse them and uncover any malice. This ensures that customer services remain uninterrupted while a thorough investigation is conducted.
Q: Under what two system states are cold system forensics mainly applied?
Ans: Dormant or Powered-off
Q: What type of attack provided a research basis for cold system forensics?
Ans: cold boot attack
Task 3 Data Acquisition and Preservation
Data acquisition and preservation are critical components of cold system forensics. This task will explore the methodologies and best practices for acquiring and preserving data, ensuring its integrity and chain of custody. This process is vital for maintaining the credibility of the data and ensuring its use in legal proceedings.
Order of Volatility
The order of volatility is an essential concept in digital forensics. It refers to the sequence in which data should be collected based on its volatility or likelihood to change. This order helps forensic analysts prioritise data acquisition, with the most ephemeral data being captured before it is lost or altered.
A typical order from the most volatile to the least volatile might look as follows:
- CPU registers and cache: These hold the most volatile data, typically lost once the host is powered down. When this data is captured, it can provide insights into current operations being executed.
- Routing Table, ARP Cache, Process Table, Kernel Statistics, and RAM: Data here changes rapidly, and capturing it may reveal information about running processes and network connections, which can help identify any malicious activity.
- Temporary File Systems: Data in temporary files is often cleared on reboot and thus changes frequently. The data can uncover recently accessed files or applications on a host.
- Hard Disk: Disk data is less volatile but can undergo alterations and deletions. Imaging the disk provides a comprehensive snapshot of all stored data, including deleted files and fragments.
- Remote Logging and Monitoring Data: These logs are relatively stable and less likely to change. They provide a record of network activity and system events over time.
- Physical Configuration and Network Topology: Documenting this data helps analysts understand the infrastructure and context of the investigation. Additionally, data transmitted over the network must be collected for forensics.
- Archival media: This data is stored offline, such as tapes and optical discs.
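The collection order above can be sketched as a simple script. This is an illustrative sketch only, not a full live-response toolkit: the output directory and file names are assumptions, and the commands read from /proc, which is available on any Linux host.

```shell
#!/bin/sh
# Sketch: capture volatile data in roughly descending order of
# volatility before a host is powered down for cold analysis.
EVIDENCE_DIR="./evidence"
mkdir -p "$EVIDENCE_DIR"

# 1. Most volatile: process table and kernel statistics
ps aux > "$EVIDENCE_DIR/processes.txt" 2>/dev/null \
    || ls /proc > "$EVIDENCE_DIR/processes.txt"   # fallback via /proc
cat /proc/uptime > "$EVIDENCE_DIR/uptime.txt"

# 2. Network state: kernel routing table
cat /proc/net/route > "$EVIDENCE_DIR/routes.txt"

# 3. Less volatile: mounted filesystems (disk layout context)
cat /proc/mounts > "$EVIDENCE_DIR/mounts.txt"

# Timestamp the collection in UTC for the chain of custody
date -u > "$EVIDENCE_DIR/collection_time_utc.txt"
```

In a real response, each output file would also be hashed immediately after capture so the collection can be verified later.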
Q: The making of a bit-by-bit copy of forensic data is known as?
Ans: Disk imaging
Q: What restricts access to sensitive data?
Ans: Access control
Q: Which sources of evidence are part of the most volatile on a host?
Ans: CPU registers and cache
Task 4 Forensic Tools and Techniques
Every forensic investigation requires tools and techniques to produce reliable results. The following tools are essential to know about and add to your toolset. We will not go into the details of using every tool, but it is vital to know what each accomplishes.
Overview of Tools Used in Cold System Forensics
Disk Imaging Tools
- dd and dc3dd: These command-line utilities create exact bit-by-bit copies of hard drives. While dd is the foundational tool, dc3dd offers additional features such as progress indicators and error summaries, making it more suitable for forensic tasks.
- Guymager: This graphical disk imaging tool for Linux supports various disk image formats, provides a user-friendly interface, and maintains forensic integrity. It also provides write-blocking functionality and can generate checksums to verify data integrity.
- FTK Imager: This forensic imaging tool offers a comprehensive approach to disk imaging. It supports various media types, such as hard drives, CDs, DVDs, and USB drives. Some of its functionalities allow data previewing and provide multiple output formats.
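A minimal sketch of forensic-style imaging with dd is shown below, using a small test file in place of a real block device (which would normally be something like /dev/sdb, accessed through a write blocker). The file names are assumptions for this example.

```shell
# Create a stand-in "drive" (in practice this is a physical device)
dd if=/dev/urandom of=source_drive.raw bs=1024 count=64 2>/dev/null

# Hash the source BEFORE imaging
sha256sum source_drive.raw | awk '{print $1}' > hash_before.txt

# Bit-by-bit copy; conv=noerror,sync keeps reading past errors and
# pads unreadable blocks so byte offsets stay aligned
dd if=source_drive.raw of=evidence.img bs=4096 conv=noerror,sync 2>/dev/null

# Hash the image AFTER acquisition and verify it matches the source
sha256sum evidence.img | awk '{print $1}' > hash_after.txt
diff hash_before.txt hash_after.txt && echo "Image verified"
```

Matching before-and-after hashes are what later demonstrate, in court if necessary, that the image is a faithful copy of the original media.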
Memory Forensic Tools
- Volatility: This robust framework is essential for extracting digital artefacts from volatile memory (RAM) dumps. Its usage is perfect for analysing memory dumps to uncover running processes, open network connections, and other volatile data that might be lost in a cold system analysis.
Disk Image Analysis Tools
- The Sleuth Kit (TSK): This is a collection of command-line tools for analysing disk images and recovering deleted data with granular control. TSK tools like fls and icat are used to list files and directories and extract file contents, while other components provide advanced analysis capabilities.
- Autopsy: Built on top of TSK, Autopsy provides a graphical interface for TSK functions, making it more accessible to non-technical users. It offers keyword searches, timeline analysis, and file carving features.
- EnCase: EnCase is a professional-grade forensic tool used for deep forensic investigations. It has extensive capabilities for disk imaging, data recovery, and analysis.
Techniques for Analysing Disk Images and Memory Dumps
Mounting and Exploring Disk Images
Mounting a disk image involves creating a virtual representation of the physical drive within the operating system. This allows forensic analysts to use standard file management tools and applications to interact with the image as though it were a physical drive. Key considerations include image format compatibility, write-blocking to protect the original data, and using virtual machines for isolated analysis.
Extracting Relevant Artefacts
This process involves identifying and extracting critical evidence from the disk image. Artefacts can include files, registry entries, database files, emails, web browser history, and system and application logs. As a forensic investigator, you must know techniques such as keyword searching, file signature analysis, registry analysis, email extraction and web history extraction.
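Keyword searching, the simplest of these techniques, can be sketched with grep. The directory and file contents below are stand-ins invented for this example; in a real case the search target would be the read-only mount point of the disk image.

```shell
# Build a stand-in for a mounted image (hypothetical contents)
mkdir -p mounted_image/Documents
echo "wire transfer to account 4421" > mounted_image/Documents/notes.txt
echo "meeting agenda" > mounted_image/Documents/agenda.txt

# Recursive (-r), case-insensitive (-i) keyword search;
# -l lists only the names of matching files for the report
grep -ril "wire transfer" mounted_image > keyword_hits.txt
cat keyword_hits.txt
```

Dedicated forensic suites index the image first so that repeated keyword queries run far faster than a fresh grep over the whole filesystem.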
Recovering Deleted Data
Deleted files leave traces on a disk. Forensic techniques such as file carving, unallocated space analysis and specialised data recovery tools can recover deleted information. Understanding file system structures is crucial for effective recovery.
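File carving works by scanning raw bytes for known file signatures rather than relying on filesystem metadata. The simplified sketch below locates a PNG signature inside a toy "image" and carves from that offset; real carvers such as foremost or scalpel also find end-of-file markers, whereas here a fixed length is carved purely for illustration.

```shell
# Build a toy raw image: junk bytes, then a PNG header, then more junk
printf 'JUNKJUNKJUNK' > disk.img
printf '\211PNG\r\n\032\n' >> disk.img   # 8-byte PNG signature (octal escapes)
printf 'IMAGEDATAMOREJUNK' >> disk.img

# grep -abo reports the byte offset of the ASCII "PNG" magic bytes
OFFSET=$(grep -abo 'PNG' disk.img | head -1 | cut -d: -f1)
SIG_START=$((OFFSET - 1))   # the 0x89 byte precedes the ASCII "PNG"
echo "PNG signature starts at byte offset $SIG_START"

# Carve the signature region with dd (bs=1 makes skip/count byte-based)
dd if=disk.img of=carved.png bs=1 skip="$SIG_START" count=8 2>/dev/null
```

The same signature-scanning idea applies to unallocated space: because deleted files are only unlinked, not erased, their headers often remain findable until overwritten.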
Analysing Memory Dumps
Memory dumps capture the contents of a system’s RAM at a specific time. Analysing memory dumps provides insights into running processes, network connections, loaded modules, and other volatile data.
Associated Risks and Common Mistakes
- Data Integrity: Ensuring the integrity of the disk image is paramount. Using cryptographic hash functions before and after analysis while employing write-blocking prevents accidental modification.
- Misinterpretation of data: Forensic analysis requires careful interpretation of findings. Correlate evidence from multiple sources and consider the limitations of tools used in the process.
- Documentation: Maintain detailed and accurate documentation of all steps taken during the analysis process, including timestamps, tool versions, and a chain of custody for the collected evidence.
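The hashing and documentation points above can be combined into a simple custody-log entry. This is a hedged sketch: the field layout, file names, and analyst ID are assumptions for illustration, and real investigations follow the organisation's own chain-of-custody forms.

```shell
# Stand-in evidence file for the example
EVIDENCE="evidence.img"
printf 'sample evidence' > "$EVIDENCE"

# Record the hash, a UTC timestamp, and the handler for each transfer
HASH=$(sha256sum "$EVIDENCE" | awk '{print $1}')
TS=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
ANALYST="j.doe"   # hypothetical analyst ID

printf '%s | %s | %s | sha256:%s | acquired\n' \
    "$TS" "$ANALYST" "$EVIDENCE" "$HASH" >> custody_log.txt
cat custody_log.txt
```

Appending one line per hand-off produces exactly the timestamped, hash-verified trail that courts expect when assessing whether evidence was altered.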
By following these guidelines, forensic investigators can effectively analyse cold systems, recover critical evidence and contribute to successful investigations.
Q: Using hash functions seeks to minimise risks associated with what element?
Ans: data integrity
Q: What is the name of the documentation responsible for listing the forensic evidence and its accompanying responsibilities?
Ans: chain of custody
Task 5 Practical
You are a digital forensics analyst at Swiftspend Finance, a prominent financial institution that has experienced a significant data breach. You’ve been tasked with using what you’ve learned to assist them in performing forensic tasks. Deploy the static site by clicking the green "View Site" button attached to this task. After completing the task, you will be given a flag to submit below.
Q: What flag do you receive after completing the Order of Volatility challenge?
Ans: THM{729a68a1253a5f4c7126110c0c600740}
Q: What flag do you receive after completing the Chain of Custody challenge?
Ans: THM{4de91692a4057c140d5a09875aba0431}
Task 6 Conclusion
Although this room introduces cold system forensics, the concepts covered apply to all areas of digital forensics. You can explore further by looking at the following rooms, which cover some in-depth concepts.
Furthermore, be on the lookout for the release of the following rooms within the Cold System Forensics module.
- Forensic Imaging
- RAM Images (Coming Soon!)
- Windows Images (Coming Soon!)