Blog

Analysis of ISE's Livestream Exploit of a NAS Device

September 10, 2018

The Independent Security Evaluators (ISE) are a fantastic group of people, who have started to post live stream videos of themselves exploiting devices. I believe that exploiting a device is trivial once the vulnerability is known; finding that bug is the extremely difficult part. Looking from the outside, it is hard to know where to even start! So, I really appreciate a detailed walk-through to understand the thought process from start to finish. The particular video that I am analyzing can be found here, done by Joshua Meyer of ISE.

Terminology

Obviously, if you are going to exploit a NAS (Network Attached Storage) device, you had better know what a NAS is. The same goes for some other terms and technologies. So, I thought that defining some terms would be a good starting point!

Encryption

Encryption is "the process of converting information or data into a code, especially to prevent unauthorized access." Encryption is in our everyday lives; it constantly protects our data from attackers.

There are two main types of encryption: symmetric and asymmetric. Symmetric encryption is really fast for both encryption and decryption; it has been around since the days of Caesar, with the classic example being the Caesar Cipher. But, it has the problem of securely getting the key to the other person. Asymmetric encryption is really slow, but solves this problem of needing a shared key! There is a public key, which everyone knows, to encrypt the information. Then, there is a private key, which only the receiver knows, that is used to decrypt the information. The only algorithm we need to know for this video is AES (Advanced Encryption Standard). But, the Caesar Cipher is a good reference to understand what is going on.
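To make the symmetric idea concrete, here is a minimal Python sketch of the Caesar Cipher. Notice that one shared key (the shift) drives both directions, which is exactly why key distribution is the hard part of symmetric encryption:

```python
def caesar_encrypt(plaintext: str, shift: int) -> str:
    """Shift each letter forward by `shift` positions, wrapping around the alphabet."""
    out = []
    for ch in plaintext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

def caesar_decrypt(ciphertext: str, shift: int) -> str:
    # Decryption is just encryption with the opposite shift: one shared
    # key for both directions, the defining trait of a symmetric cipher.
    return caesar_encrypt(ciphertext, -shift)

print(caesar_encrypt("attack at dawn", 3))  # dwwdfn dw gdzq
```

AES works on very different mathematics, of course, but the shape is the same: the decryptor needs the very key the encryptor used.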

Disassemble

By the literal definition, this means to take something apart. However, in penetration testing, this means to take a binary or executable file (which is just 0's and 1's) and convert it into a human-readable form. Even though going from binary back to C++ or Python is not possible in most cases, going to Assembly is! Tools like IDA (which was discussed in my last blog post about another ISE exploit) or Binary Ninja can take a binary, convert it into Assembly, and then add other features that make it possible for a human to follow, such as branching guides.

Nmap

Nmap is a network analysis tool that is extremely good at viewing which ports are open and what services they are running. I have a guide for a brief workshop that I did on Nmap at https://github.com/mdulin2/nmap_writeup
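At its simplest, a TCP port scan is just a series of connection attempts. Below is a minimal Python sketch of the idea; this is not how Nmap actually works (it uses raw packets and far smarter techniques), and the function name is my own:

```python
import socket

def scan_ports(host: str, ports: list, timeout: float = 0.5) -> list:
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising on failure
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example: check a handful of well-known service ports on localhost.
print(scan_ports("127.0.0.1", [22, 80, 443]))
```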

URL Encode

A URL can only contain a limited set of characters. So, special characters in a request are encoded as %XX, where 'XX' is the character's value in hexadecimal. When running penetration tests, it is crucial to follow all of the standard protocols, to ensure things run how they are supposed to.
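Python's standard library can show the encoding in action; `urllib.parse.quote` produces exactly the %XX form described above (the sample string here is just an illustration):

```python
from urllib.parse import quote, unquote

raw = "a b&c=d/e"
encoded = quote(raw, safe="")  # encode every reserved character
print(encoded)                 # a%20b%26c%3Dd%2Fe

# The server reverses the process before handling the request.
assert unquote(encoded) == raw
```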

Other

Some of the tools/terms being used I have defined in my previous blog post, such as Burp Suite, IDA, Telnet, Shell Scripting and a few other things.

Humble Beginnings

Goals

What is the goal of any penetration test? Well, let's think like a hacker here. The goal of a hacker is to gain as much of a foothold on the system as possible. So, looking for command injections, unauthenticated requests and privilege escalation are all great ways to get that foothold on the device. Before watching this video, I did not understand what to look for during a penetration test in order to compromise a system. Now, though, this seems much more clear to me.

Information Gathering

Playing around with the device to understand how it works is crucial. Josh messes around with a few of the NAS's interfaces before proceeding much further. Events such as adding users, adding files and a few other things were tested. From this, he figured out, through viewing Burp's view of the web requests, that PHP was being used. PHP is known for having remote code execution vulnerabilities everywhere, just because of the nature of the language. So, this was a great thing to see. Further, he figured out that input validation was not being done very well. Within the first few minutes, when adding a user, an XSS (Cross Site Scripting) vulnerability and a command injection both appeared possible. Again, understanding the attack vector is immensely important! So, what to do next?

Base Knowledge

Having a great base knowledge of how these types of systems work cannot be overstated! I felt that the XSS that Josh found was a reasonable thing to test. However, the command injection was not, to me at least. Josh knew that when a user was being added, the useradd command on the Linux OS was likely being run. From here, he could craft a special username to create a file or do anything else he wanted on the system. Having a great base knowledge of how a system works is immensely important to a pen-test. Further, Mr. Meyer knew the workflow of how PHP files had to be executed and quite a bit about encryption. Without this knowledge, he would not have been able to construct this exploit.
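To illustrate why a crafted username is so dangerous, here is a Python sketch of the vulnerable pattern. The function names are hypothetical; the real NAS presumably did the equivalent in PHP with exec():

```python
import shlex

def build_useradd_command(username: str) -> str:
    # Vulnerable pattern: untrusted input pasted straight into a shell
    # string, the way a PHP handler might do exec("useradd " . $name).
    return "useradd " + username

# A benign username produces the expected command...
print(build_useradd_command("alice"))         # useradd alice

# ...but a crafted one smuggles a second command past the shell.
malicious = "bob; touch /tmp/pwned"
print(build_useradd_command(malicious))       # useradd bob; touch /tmp/pwned

# The fix: quote the input so the shell sees a single argument.
def build_useradd_command_safe(username: str) -> str:
    return "useradd " + shlex.quote(username)

print(build_useradd_command_safe(malicious))  # useradd 'bob; touch /tmp/pwned'
```

If the vulnerable string is handed to a shell, everything after the `;` runs as its own command with the web server's privileges.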

Reading The Source Code

Most of the time this is quite trivial, but this video required some incredible maneuvers to simply read the source code of the project. So, it is definitely worth discussing. The reason Josh really wanted to read the source code was because of how bad the input validation was. This, combined with PHP making it very easy to run OS commands directly, is a deadly combination.

Discovery Of PHP Files

Meyer wanted to view the PHP source code to search for functions like 'exec', 'system' and 'shell_exec', among others. But, when he went to view the .php files from the network requests that he had made, they were a garbled mess! He ran the file command on a PHP file to see that it was a binary data file. How could this be? It is labeled as a PHP file, but it's just a mess!

Thought Process After Seeing The Files

After seeing this, Mr. Meyer knew that the files must be encrypted! Although this may seem obvious reading it now, the thought did not cross my mind when the files were just a garbled mess. The files are encrypted? How does the PHP interpreter deal with this? There is no way it can handle this, is what went through my mind. At this point, Meyer knew that the PHP interpreter had to be decrypting the files at execution time for them to work. So, where is the key?

Getting the Key

First of all, Meyer needed an incredible amount of knowledge to figure this out. I would really like to tip my cap to the man for deciphering this puzzle. The first thing he did was disassemble the PHP interpreter into Assembly with Binary Ninja.

Now that he has the code for the system, where is a good place to start looking? This interpreter must be gigantic... Again, having an overall knowledge of the cyber security world helps! Josh made two key assumptions that enabled him to find the key so quickly: 1) The files must be encrypted with a symmetric algorithm of some kind. I assume he made this call because the interpreter must decrypt the source code really quickly for each file. From this, he deduced that 2) the AES encryption algorithm is probably being used, due to its sheer popularity as a symmetric encryption algorithm. These two assumptions reduced finding the key to a simple string search for "AES".

After reading a little Assembly, he figured out that AES is indeed used to encrypt the files. Further, the key value is hashed with the MD5 hashing algorithm, to add a little bit of indirection. So, now, we have the key!

Decrypting The Source Code

Now, with the key, getting the source code should be a breeze. He wrote a bash script that took the MD5 hash of the key, then piped it through a few parsers to get rid of annoying things, such as dashes. After this, he used OpenSSL to decrypt the file. Even though some parts of the file were not decrypted properly, the bulk of the file was readable!
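As a rough sketch of that pipeline in Python (the key string, file names and exact OpenSSL flags here are my own assumptions, not the values from the video):

```python
import hashlib

# Stand-in for the hard-coded key string recovered from the disassembled
# PHP interpreter (the real value is not reproduced here).
recovered_key = "example-key-from-binary"

# Step 1: MD5-hash the key. In the bash version this was `md5sum` piped
# through parsers to strip dashes; hexdigest() is already clean hex.
derived = hashlib.md5(recovered_key.encode()).hexdigest()
print(len(derived))  # 32 hex characters, i.e. a 16-byte (AES-128-sized) key

# Step 2: hand the derived key to OpenSSL to decrypt one file. The cipher
# mode and flags are assumptions; the video's exact invocation may differ.
cmd = (
    "openssl enc -d -aes-128-ecb -K " + derived +
    " -in page.php -out page.decrypted.php"
)
print(cmd)
```

Conveniently, an MD5 digest is exactly 32 hex characters, which is the size OpenSSL expects for a 128-bit key passed via `-K`.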

Entering the Exploitation Zone!

Hmm, So Many Holes

Mr. Meyer then searched for 'exec' and found hundreds of examples of its usage. Obviously, with very little input validation, command injections were going to be everywhere. However, let's think like an attacker again... Will I have credentials to log onto the device to add a user? No, probably not. At this point, he decided to look for unauthenticated requests in the code for a more useful attack. He could have searched for privilege escalation vulnerabilities, then used the command injection. But, that seemed to require too many steps.

Decision Time

After searching for a while, he came to one particular function that did not require authentication and ran an exploitable command without input validation. The process of finding and choosing this was not discussed in the video; but, I am sure it took a long time. I wish this had been discussed in a little more detail, personally.

Fun Time!

After identifying a vulnerable target, he encoded a telnet connection in his request. Even though it was a pretty simple command, it caught my eye. Typically, when I want to test command execution, I use commands like cat or something similar. However, his request looked like this:

 
/usr/sbin/telnetd -l /bin/sh -p 12345
The command uses the system binary's absolute path, to ensure that the payload is executed correctly. In this situation, I do not believe it is necessary. But on systems with poorly configured aliases or changed commands, executing the binary by its absolute path guarantees that the payload will run. Note: the above payload also needed to be URL encoded for proper usage.

In order to test whether the exploit worked, he runs a quick Nmap scan on the target NAS device. Once he sees the newly opened telnet port, he knows he has successfully created a backdoor onto the device. From here, simply connect with telnet to take files, steal credentials or do whatever mischievous things there are to do!

Personal Takeaways

Overall, this exploit went through quite a few things! The hardest part was not finding a vulnerable request, because they all seemed to be vulnerable, but finding an unauthenticated one.

Automation

Going through every single file to find all the command injection points just did not seem feasible. Further, decrypting every single file one by one would also have been a waste of time. So, in both cases, this was automated! Creating the automation step can take some time; but, in the long run, it saves an immense amount of time. Automate, automate, automate...

The Flow

Understanding the flow of a program is incredibly useful. This includes which parameters are needed for certain branches to be taken, the necessary values for those parameters, where the execution flow goes from file to file... There are so many parts that contribute to the flow of a program. Without an understanding of the flow, this exploit most likely would not have been found. This summer, while working at Faithlife, I traced an immense amount of code attempting to debug it. I am very happy that I got this opportunity, as it has really helped my ability to understand how a piece of software works.

Another example of understanding the flow was when the PHP files were a garbled mess. Mr. Meyer knew that the files had to be decrypted at some point. So, he was able to figure out that the PHP interpreter itself was doing this.

Systematic

A few days ago I was playing around with OWASP Juice Shop, which is a purposely vulnerable web app to practice pen-testing on. After a few hours of trying to find a particular vulnerability, I decided to see what I had missed. Well, instead of checking for XSS by bypassing the client-side validation on every request, I had only tried it on almost all of them. This mistake was made because I randomly decided what to test and how to test it. From this, I gathered that my penetration testing techniques needed to be systematic, automated when possible and intentional. Being systematic also means writing ideas, plans and scripts down for later use and collection of thought.

By understanding what has been tested, and how, tests will not be skipped or duplicated. It is hard to appreciate how many notes, ideas and other artifacts are produced during the actual process of a penetration test. Over the summer, I took diligent notes about what I was doing, which helped me slow down and make better design decisions. But, it is not safe to assume the documentation of a penetration test will take care of itself. While I was watching this stream, I realized the importance of being intentional with a penetration test.

Another reason I bring this up is that, during the presentation, Meyer seemed a little lost while trying to find a particular function towards the end of the stream. If he had taken better notes, this likely would not have happened on a live stream of the exploit.

Deep Breath Goes a Long Ways

As I mentioned before, I was really taken aback by the solution to the PHP files being a mess of binary data. I personally would never have come to that solution at first glance. At this point, it would have been a very valuable decision to take a step back. To me, taking a deep breath to regather myself goes quite a long way! If I do not do this, then I get pigeonholed on a certain solution to a problem. So, the real takeaway is to take a step back, take a deep breath and think about the problem from a different angle once I'm stuck.

Conclusion

At home, please give Joshua Meyer a hand! These sorts of videos and walk-throughs are very beneficial to the up-and-coming penetration testing community. Please keep up the good work, Josh! Further, if you or an ISE employee is reading this article, please contact me! I would love to discuss this vulnerability and the process of discovering it more! My email is in the footer or on my GitHub, mdulin2.

ISE, I appreciate your company's work and would love to work there someday. The research your company does is quite fascinating! Thank you again for posting these live streams and hosting the IoT Village at DEF CON 2018.

Hope you learned something from this! I really enjoyed listening to a professional pen tester go through his mindset when testing something. I look forward to becoming a better pen tester and finding exploits myself! Cheers from Maxwell Dulin (ꓘ).



Analysis of ISE's Exploitation Video of SNMP

August 19, 2018

To start with, I want to give a huge shout out to the ISE(Independent Security Evaluators) for running the IoT village and the SOHOpelessly Broken challenge at DEFCON 26. I really enjoyed the village, contest, people and the talks! Please keep up what you're doing!

Today, I wanted to discuss the analysis and description of CVE-2018–12313, which was analyzed in this video by ISE researcher Shaun Mirani. An article describing the CVE can also be found here. I would like to thank the researchers at ISE for posting an exploitation video about this CVE, and not only that, but for showing the progression of how the device was actually tested. I personally find the hardest part of penetration testing to be actually finding the exploitable section, rather than exploiting it. Once the vulnerability is known, I believe the exploitation is typically trivial. This video gave me some great insight into how applications are actually tested, which is really appreciated! But, this could be an even better video.

I would even ask to take this a step further: to show a live stream of the entire penetration test on the device itself. Even though it would be quite boring and uneventful to do this, I believe it would provide so much benefit to the up-and-coming people in the IoT penetration testing area. Most of the time, I don't even know where to start with a penetration test! The video jumps directly into dealing with authentication; but, I'm not sure why they started there. So, by streaming the entire process, the community could get a better understanding of the practices of good penetration testing. Even without this, the video was extremely insightful and awesome! I would like to see more videos like this in the future from ISE.

My Thoughts

This video was quite amazing at demonstrating the vulnerability. However, I'd like to lower the barrier to entry for other people and take notes for myself to use in the future. Below are a few takeaways from the penetration test. To start with, some basic terms to understand the situation should be covered first.

Terminology

NAS(Network Attached Storage)

A good link describing a NAS can be found here. Essentially, a NAS is a file-level storage system attached to a network. The network aspect gives users access to their files, regardless of where they are.

SNMP(Simple Network Management Protocol)

This article from Microsoft says it really well: "SNMP, which is widely used in local area networks (LANs), lets you monitor network nodes from a management host...Using SNMP, you can monitor network performance, audit network usage, detect network faults or inappropriate access, and in some cases configure remote devices."


Continue Reading →

Faithlife Internship Review

August 17, 2018

Going into my last day at work, I thought it was appropriate to reflect on my previous 3 months at the company. Faithlife is an amazing place to work, constantly winning awards for the best company to work at. The internship program is no different. There are brilliant people all around, major opportunities for growth and fun projects to work on. I would love to give a huge shoutout to David Mitchell, the 13 year veteran of the company, for taking ownership of the intern program! It has been a fantastic learning experience to the point where I have virtually no complaints about the internship. Thanks for making this such a great experience.

The Program

Sixteen years ago, David Mitchell started off as an intern at Logos, which is now Faithlife. Bob Pritchett, the CEO, walked up to his desk on the first day, dropped a research paper on it and said "I like this; implement it". This is truly terrifying! As a college student without much of an idea of how to code in the real world, David just had to figure it out, which was a great experience, but it could have been done better. Since then, David has taken ownership of the internship program; it's his goal to give the interns the best experience possible.

Mentors

Every intern is given a single person to mentor them throughout the internship. A single person! My mentor, Erik Lanning, has been an amazing contributor to my learning. Whenever I had a question, he would drop whatever he was doing in order to help me. I felt that Erik actually cared about my success, even when he was quite harsh on me during code review. Erik was harsh during code review because he wanted to see me grow. Every comment on GitHub to rename something came with a nicely tagged article on why, giving me the ability to not repeat the mistake. This helped me grow at an extremely rapid pace during the internship. I believe that all of the mentors were held to a very high standard by David, which turned out really well for the internship program.

1 on 1's are a really powerful way to give people feedback. As interns, we're provided with an hour of our mentor's time solely to get feedback, ask questions, receive constructive criticism and chat about the experience so far. The words "Max, you're quite sloppy with your code" really hit me like a truck during one week of the 1 on 1's. But I really appreciated the honest feedback! It led me to be more intentional with spacing, variable names, styling, organization, optimizing and much more. I would like to give another huge shoutout to my mentor Erik Lanning for investing 3+ months of time and effort into me. It's been such a great learning experience to work on the Faithlife Groups team.

Events

Do you not have many friends in Bellingham? Well, Faithlife has got you covered! They usually have two events per week for the interns. This includes go-karting, hiking, ice skating, escape rooms, boat rides and much, much more. With, of course, Faithlife paying for the event and the food.


Continue Reading →

Linux: File System

May 23, 2018

Linux! Man, this operating system (OS) is amazing! Instead of having virtually no control over the system (like on macOS), we have access to everything! I hate that in Windows I cannot delete Internet Explorer and Microsoft Edge... And trust me, I've tried.
Linux, it's time to shine! Being able to edit anything we feel like can be amazing, but it can also break the entire computer. So, be aware and wise about the changes you make!

Background:

Open source is the whole reason the internet works. From the server in your house to the FBI's website, there's sure to be a piece of open source software somewhere in the mix. The term open source means to make a piece of software, then freely give it to the world! This has become the norm for the software development industry; and we're all very grateful for it. It's led to so many fantastic revolutions in the tech industry.

All hail the king and chief, Linus Torvalds! This wonderful man gave us the start of all Linux-based operating systems today. Linus started working on a MINIX (mini UNIX) system, which was a free operating system at the time with few features. However, he started to customize it to his own liking. Then, he realized that maybe other people would love to start working on his new operating system. In fact, the name Linux comes from Linus's MINIX system.

In Linux, everything is stored in files. From processes to keyboard inputs, it's all in files! This can make for some really fun and interesting manipulations of the OS. So, to be able to manipulate the operating system, understanding where different items live is very important for creating a good environment to work inside of.

The System:

At the very top of the hierarchical structure is the root. Inside of the root are a myriad of directories, a few of which will be talked about here.

Bin:

Bin is simply short for binary. Inside of this directory are the binary executable programs that make up the operating system's commands. Commands such as ping, cat, kill and every other command in the terminal (that is not an alias) are inside of this folder. Typically, we don't want to alter these. But, when installing something, the binary compilation of the file goes into a bin directory.
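A quick way to see where a given command's binary actually lives is Python's `shutil.which`, which searches the same PATH the shell does (the exact paths printed will vary by system):

```python
import shutil

# Resolve where a few everyday commands live on this system,
# e.g. /bin/cat on some distributions and /usr/bin/cat on others.
for command in ["sh", "ls", "cat"]:
    print(command, "->", shutil.which(command))
```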


Continue Reading →

Linux Commands: Netstat

May 19, 2018

Netstat is one of the best, yet simplest, tools for checking what is happening on the network. It extends from basic operations, like monitoring TCP (Transmission Control Protocol) traffic, to complex operations, such as showing the statistics of ports or protocols.

Background:

As a prerequisite for this tutorial, it's important to understand some networking basics. This is quite obvious, because Netstat is a networking tool itself.

TCP/UDP/ports:

There are two main protocols that we will be looking at: TCP and UDP (User Datagram Protocol). These are the underlying protocols for sending most information on the internet.

TCP is used when information MUST arrive in a particular order. Because packets (little bits of information) must be in order, when one arrives out of order or goes missing, TCP just calls for another packet. This makes the protocol quite slow. So, that's why UDP was invented!

UDP is much faster than TCP; this is because it does not keep track of the order of the packets that are coming in. Even though this leaves room for issues, with some packets not reaching the destination, there are situations where speed is more important than having the packets in perfect order. A perfect example of this is streaming video; the basis of Netflix streaming is a protocol based on UDP.

Ports are where the information is being sent to. They help keep information going to the same exact place on a network all the time. At this point, ports are only symbolic and have 'conventions' as opposed to rules. The most common port, which people use every day, is the HTTPS port (443). Even though HTTPS uses 443, this is only by convention; in theory, it can run on any port.

Networking is much, much more complicated than what is explained above. But, this should cover you for the tutorial on Netstat.
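The TCP/UDP difference shows up directly in Python's socket API: UDP just sends, while TCP must establish a connection first. A tiny loopback demo (using an OS-chosen port, purely for illustration):

```python
import socket

# UDP: connectionless datagrams; no handshake, no ordering guarantees.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))        # port 0 lets the OS pick a free port

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello over UDP", receiver.getsockname())

data, _ = receiver.recvfrom(1024)
print(data)                            # b'hello over UDP'
sender.close()
receiver.close()

# TCP: SOCK_STREAM sockets must complete a handshake (connect/accept)
# before any data moves, which is what buys ordered, reliable delivery.
tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_sock.close()
```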

Basic Commands:

Below is how to display all TCP connections currently running.
 netstat -at


Continue Reading →