Behind the scenes of our security assessment
Table of contents
- Internal technology development
- The Security Assessment
- The environment analysis
- You can do it that way, but…
- The birth of our new wrapper
- The strength of PowerShell: Comparison with Bash
- Challenges in PowerShell development
- Conclusion: Faster, more stable analyses for more efficient risk mitigation
Internal technology development
Today we would like to take a look behind the scenes of our security assessment, but what is it anyway? In a nutshell, we enable you to make informed decisions and plan and effectively implement your next steps on a solid, risk-aware basis. If you want to learn more about security assessment, here is the right place.
In this blog, however, the focus will be on the internal technical development of our environment analysis. We are very interested in receiving feedback and discussing our approach. Perhaps we can improve even further?
The Security Assessment
The content is divided into three main pillars, which Teal Consultants go through step by step with our customers.
- Firstly, we attach great importance to ensuring that you understand how attackers proceed and how you can protect yourself in theory. This is ensured by an intensive know-how transfer at the beginning of the assessment.
- It also includes a comprehensive analysis of your environment with currently 37 specific security checks. But technology is not everything! We get to know your working methods better with the help of our comprehensive questionnaire.
- A detailed final report, which includes a risk analysis and, based on it, a recommendation of suitable measures, completes our approach.
The environment analysis
We use various established tools and in-house developments to analyze your environment. Unlike traditional providers of security software, however, our tools do not remain in your environment: they must work without much installation and configuration effort and disappear again after the analysis. As a result, we initially used simple PowerShell scripts to execute these tools one after the other. Each tool then stored the collected data individually, and we took this data with us for analysis.
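To give you an idea, the old approach looked roughly like this (a simplified, hypothetical sketch; the tool names and paths are placeholders, not our actual scripts):
# Run each tool sequentially; each one writes its own result file
.\Invoke-ADScan.ps1 -OutputPath "C:\Temp\Results\adscan.csv"
.\Invoke-ShareAudit.ps1 -OutputPath "C:\Temp\Results\shares.csv"
.\Invoke-GpoCheck.ps1 -OutputPath "C:\Temp\Results\gpo.csv"
# ...and so on, one tool after the other, each with its own quirks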
You can do it that way, but…
This approach has led to numerous problems, including…
- …the script became longer and more confusing over time
- …the coding style varied depending on who extended the script and on the peculiarities of the tool being integrated
- …it was difficult to define uniformly which tools were to be executed on which systems
- …standardized logging or troubleshooting was out of the question
All of this led us to fundamentally reposition ourselves. That’s why we redesigned and rebuilt our “wrapper” from the ground up in an internal project.
The birth of our new wrapper
We are pleased to introduce our new wrapper, a testament to innovation and forward-thinking design. This tool is not just a product; it is a solution that aims to put simplicity, scalability and maintainability at the forefront of assessment analysis.
Core functions of the wrapper: simplification and security
Simplified reporting: Our wrapper simplifies the process of running reports on Active Directories and ensures that data collection is both efficient and comprehensive.
Customizable architecture: The wrapper’s modular design guarantees flexibility and handles a variety of Active Directory scenarios with ease.
Enhanced security with SMB shares: Security is a cornerstone of our wrapper, which utilizes SMB shares to keep all reporting results and dependencies secure, even in the event of crashes or lost connections.
PSSession Manager: A critical component, the PSSession Manager, orchestrates sessions with integrated logging, manages dependencies and consolidates results effectively (see the sketch below).
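To give a rough feel for what this orchestration looks like, here is a minimal, hypothetical sketch. The cmdlets are standard PowerShell remoting; names such as $targets, $credential and the paths are placeholders, not our actual implementation:
# Log everything the session manager does
Start-Transcript -Path "C:\Temp\wrapper.log" -Append
# Open remote sessions to the systems under assessment
$sessions = New-PSSession -ComputerName $targets -Credential $credential
foreach ($session in $sessions) {
    # Stage the dependencies on the target system
    Copy-Item -Path ".\checks" -Destination "C:\Temp\checks" -ToSession $session -Recurse
    # Execute the check; it writes its results to the SMB share
    Invoke-Command -Session $session -FilePath ".\checks\Invoke-Check.ps1"
}
Remove-PSSession -Session $sessions
Stop-Transcript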
The strength of PowerShell: Comparison with Bash
When I switched from Linux to Windows, I was initially skeptical about PowerShell. But it didn’t take long for me to appreciate its simplicity and versatility. The object-oriented nature of PowerShell made handling complex data structures a breeze.
As an example that underscores PowerShell’s ease of use, let’s list and count files in a directory. Let’s say we want to determine the number of text files in a particular folder.
PowerShell:
In PowerShell, this takes just two short lines:
$files = Get-ChildItem -Path "C:\MyDocuments" -Filter "*.txt"
Write-Host "Number of .txt files: $($files.Count)"
These commands list all .txt files in the “C:\MyDocuments” directory and display their number. The output is clear and direct, without the need for additional processing steps.
Bash:
In comparison, Bash requires several commands to achieve the same result:
files=$(find /my/documents -type f -name "*.txt")
echo "Number of .txt files: $(echo "$files" | wc -l)"
Here, “find” uses a combination of parameters to search for the files and “wc -l” counts them (note that “$files” must be quoted so that the newlines between the file names survive and can be counted). This process is less intuitive and requires a deeper understanding of command line options.
This example shows how PowerShell simplifies file system management by providing powerful and easy-to-use cmdlets that eliminate the need for complex command chains common in traditional shells like Bash.
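Incidentally, if the intermediate variable is not needed, the PowerShell version even collapses into a true one-liner (same directory as above):
# Count the .txt files directly, without a helper variable
(Get-ChildItem -Path "C:\MyDocuments" -Filter "*.txt").Count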
Challenges in PowerShell development
Error handling
PowerShell development offers many advantages, but it also confronts developers with challenges, especially in the area of error handling. Error messages in PowerShell are often cryptic and inconsistent, making troubleshooting a test of patience.
An example of this is a script that attempts to connect to an unreachable network path, whereupon PowerShell simply outputs:
“Error: network path not found”.
Without more specific information, the developer is left guessing whether the problem lies with the network connection, the permissions or some other aspect.
The documentation for PowerShell and .NET then becomes an indispensable guide to deciphering the meaning behind the error codes and finding solutions.
An improvement in error handling could be achieved through more precise error messages and better diagnostic tools that help developers to solve problems more efficiently.
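Until then, a pattern we rely on ourselves helps: force terminating errors and surface the underlying .NET exception type. A minimal sketch, with a placeholder UNC path:
try {
    # -ErrorAction Stop turns the failure into a terminating, catchable error
    Get-ChildItem -Path "\\unreachable-host\share" -ErrorAction Stop
}
catch {
    # The concrete exception type usually reveals whether it is a network,
    # permission or path problem
    Write-Host "Type:    $($_.Exception.GetType().FullName)"
    Write-Host "Message: $($_.Exception.Message)"
    Write-Host "Target:  $($_.TargetObject)"
}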
The limits of object-oriented programming in PowerShell
The challenges that PowerShell presented in terms of object-oriented programming (OOP) forced us to rethink and adapt our development model several times. Although PowerShell effectively utilizes the OOP features of the .NET framework, we encountered limitations in creating classes and interfaces natively. These limitations often led us to resort to languages such as C# to achieve our goals. While these workarounds were effective, they were a compromise and far from the ideal, seamless development experience we were striving for.
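A typical example of such a workaround (a simplified sketch, not our production code; the interface and class names are hypothetical): since PowerShell classes cannot declare interfaces of their own, the interface is defined in inline C# via Add-Type and the compiled types are then consumed from PowerShell.
# Compile an interface and an implementation from inline C#
Add-Type -TypeDefinition @"
public interface ISecurityCheck
{
    string Name { get; }
    bool Run();
}

public class DummyCheck : ISecurityCheck
{
    public string Name { get { return "DummyCheck"; } }
    public bool Run() { return true; }
}
"@

# The compiled type is then available like any other .NET class
$check = New-Object DummyCheck
Write-Host "$($check.Name) succeeded: $($check.Run())"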
Runspaces and OOP: overcoming a challenge
In PowerShell, runspaces are a fundamental concept for executing tasks in parallel, which is essential for improving performance and efficiency. However, the OOP limitations in PowerShell pose a unique challenge when it comes to preparing these runspaces, especially when using job runspaces.
The main problem stems from the fact that the native creation of interfaces or classes is not supported in the same way as in more traditional OOP languages. This means that developers often have to find workarounds to approximate these OOP principles.
The GetPreparationScript method: A practical workaround
The approach involves a method, GetPreparationScript, that returns a preparation script together with the values to pass into the runspace. This method is particularly useful for local jobs, where no session can be prepared in advance. Here’s a closer look at the workaround:
function GetPreparationScript {
    $code = {
        param(
            [string] $sharePath,
            [string] $tempDir,
            [string] $driveName,
            [pscredential] $credential
        )
        $drivePath = $sharePath
        # Fall back to the local temp directory if the current session runs on the local machine
        if ($drivePath.StartsWith("\\$env:COMPUTERNAME")) { $drivePath = $tempDir }
        New-PSDrive -Name $driveName -PSProvider FileSystem -Root $drivePath -Credential $credential
    }
    $params = @($this.sharePath, $this.tempDir, $this.driveName, $Global:shareCredential)
    return $code, $params
}
This script block is designed to be returned with parameters, which makes it suitable for local jobs where no PSSession can be prepared. It cleverly checks whether the current session is on the local machine and reassigns the module path accordingly. It then creates a new PSDrive with the specified parameters.
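For illustration, consuming this in a local job could look roughly like the following (hypothetical usage; $wrapper stands for the object that carries GetPreparationScript as a script method, which is also where the $this references come from):
# Fetch the preparation script block and its argument list
$prepScript, $prepParams = $wrapper.GetPreparationScript()
# Run it as a local job; -ArgumentList maps the array onto the param() block
$job = Start-Job -ScriptBlock $prepScript -ArgumentList $prepParams
Wait-Job -Job $job | Receive-Job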
Conclusion: Faster, more stable analyses for more efficient risk mitigation
With the new wrapper, we have succeeded in significantly improving the stability and speed of an environment analysis. Whereas older versions took hours to collect all the results, the new technology allows us to start the various scripts in parallel, saving us a significant amount of time.
Every customer environment is different and even today we occasionally have to troubleshoot and eliminate errors. However, this is much easier with standardized logging.
Thanks to standardization, it is now possible for any employee to define a new assessment/checkpoint and implement it quickly. The days when only 1-2 people with “coding” skills could expand the assessment are over. This helps us enormously to focus on the actual purpose, which is to expand the assessment with new checkpoints.
What was not intended at the beginning, but has established itself as further added value, is software distribution in critical areas. In our AD tiering projects, T0 systems are defined at the start. Inevitably, the question arises: how can software be deployed or installed on these systems? The established software deployment tools are usually located in the T1 area and therefore cannot be used for this; you can find details on this topic in our blog series ESAE Deep Dive. The new wrapper, however, does not only deploy and execute assessments. Ultimately, it is a tool for copying sources to, and executing things on, one or more remote systems, and we are already using it for precisely this purpose with some customers.
To summarize, the changeover has not only improved our assessment, but has also given us another “asset” for our consulting activities.