Wednesday, October 23, 2019

Examining Maptiles from iOS


During my last class at Champlain College Online, I ran across something interesting during an examination of GPS artifacts on iOS.

I noticed a cache of map tiles located at Data/mobile/Library/Caches/MapTiles/MapTiles.sqlitedb. The SQLite database contains an images table, as seen below:

If you export the "data" column, you get a set of images:
These images are square tiles cached by the iOS Maps application. They may be cached when viewing the map or using directions.
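Exporting the tile blobs can also be scripted. Below is a minimal sketch using Python's built-in sqlite3 module; the table name `images` and blob column `data` come from the database above, but everything else (file naming, the .png extension) is an assumption on my part, since the tile image format may vary:

```python
import sqlite3
from pathlib import Path

def export_tiles(db_path, out_dir):
    """Export every BLOB in the images table's data column to its own file."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    con = sqlite3.connect(db_path)
    try:
        # rowid gives each tile a stable file name without assuming other columns
        for rowid, blob in con.execute("SELECT rowid, data FROM images"):
            (out / f"tile_{rowid}.png").write_bytes(blob)
    finally:
        con.close()
    return sorted(p.name for p in out.iterdir())
```

Running `export_tiles('MapTiles.sqlitedb', 'tiles')` would drop each tile into a `tiles` directory for review. As always, work on a copy of the database, never the original evidence.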

If these tiles are placed together, they create a map like the one shown on Google Maps. I used PowerPoint to combine the images.

Now, you can take this to Google Maps and find the location. Using the location information already found within the iPhone, we can match the area (see below for a Google Maps screenshot of Burlington, VT at I-89 and Route 2A).


Matching map tiles can be extremely difficult without additional information. I had other artifacts from the device suggesting it might have been in the Burlington, VT area, which is where I started the search.
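Many map caches use the standard Web Mercator "slippy map" tile scheme, where each tile is addressed by a zoom level and an x/y index. I did not confirm that the MapTiles keys follow this scheme, so treat that as an assumption; but if tile coordinates are present, the well-known conversion gives the latitude/longitude of a tile's north-west corner and can narrow the search area considerably:

```python
import math

def tile_to_latlon(z, x, y):
    """North-west corner of slippy-map tile (z, x, y) in degrees (Web Mercator)."""
    n = 2 ** z  # number of tiles along one axis at zoom z
    lon = x / n * 360.0 - 180.0
    lat = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * y / n))))
    return lat, lon
```

For example, `tile_to_latlon(0, 0, 0)` returns the north-west corner of the single zoom-0 tile, roughly (85.05, -180).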


Trace Labs CTF Judge vs Member

Trace Labs National Australian Missing Persons Hackathon 2019

Previously, I participated in Trace Labs' virtual CTF on July 13; my experience can be read at Trace Labs Global Remote 2 CTF. After finishing that CTF, I decided to be a remote judge, from the United States, for the Oct 11 CTF in Australia. One thing to remember: Trace Labs CTFs are passive research contests. You can read the CTF Rules, and the company and volunteers will refer to these rules during the CTF. Please remember one thing: do not contact family, law enforcement, or anyone else outside the CTF.
The above slideshow shows the stats released by @TraceLabs for the event. Over the complete event (6 hrs), everyone generated 3,912 total leads, which is impressive.

@AustCyber posted some information as well about the CTF.

As for judging, I think a person who wants to judge needs some background in how to look at OSINT information. The person may not need to know how to locate and search for information with tools such as the OSINT Framework; however, many of the pieces of information submitted came from the deep web (websites not indexed by search engines or hidden behind a login wall). Understanding OSINT techniques will help with both participating in and judging these CTFs.

Should someone judge before participating? I think that is up to the individual. Personally, having participated first, I was better prepared for what information would be gathered and how it would be presented on the platform by an investigator. That said, since the platform works much the same for judges and investigators, there is no harm in judging before participating in a CTF.

I think judging remains the easier role, since many of the investigation techniques are not required, and it exposes a person to how OSINT is performed through passive techniques.

I find OSINT interesting, but I need more experience at it. For those looking for OSINT puzzles, you can follow @quiztime on Twitter, which posts puzzles from a number of contributors. These posts are mainly video and imagery OSINT.

Monday, September 30, 2019

Conference: Cyber Security Summit (Charlotte) 2019

This month I had the pleasure of attending the Cyber Security Summit in Charlotte at The Westin on September 17, 2019.

The summit was a single day connecting "C-Suites & Senior Executives responsible for protecting their companies' critical infrastructures with innovative solution providers and renowned information security providers" (from the site). The conference was sponsored by ExtraHop, a cloud network security company.

For those who have never been to one of these, I find the single-day, single-track format fairly good. The conference's focus on C-suites and senior executives meant many of the talks throughout the day sat just outside the technical realm, which worked well for this type of conference; the same focus was also evident in the vendor hall.


ExtraHop's John Smith gave the keynote and the first talk of the day, which centered on the problems posed by the volume and diversification of our data. From 2005 to 2025 (projected), the amount of data stored will have increased by 43%. This volume of stored data causes concern for SOCs trying to perform incident response and threat hunting. He hit on a big topic for the whole day: machine learning. He explained how to improve SOCs today by adding machine learning to augment human analysis. Machines crunch numbers and large amounts of data, but finding anomalies can be difficult and, at least for now, requires a human touch.


Next, DarkTrace's Marcus Fowler spoke about evolving threats and the use of machine learning and artificial intelligence. Threats today continue to increase in speed and sophistication, while access to advanced tools is commonplace for many entry-level cyber criminals. DarkTrace went on to talk about their platforms, the Enterprise Immune System and Antigena, named after the human immune system and antigens. The immune system, it was explained, learns in real time with no prior bias: it learns what is 'normal' and detects threats on the fly while surfacing high-priority threats to teams. When paired with Antigena, enterprises can get autonomous responses, meaning the system can change security rules on the fly or repeat an action depending on pre-configured rules, the learned configuration, or a response from an analyst. There is also a mobile application that can be used to act on security risks in a network, from activating a security change to calling an incident response team.


IBM Security's David Cass spoke about moving and securing information in the cloud. One of the biggest points was adopting the cloud based on business value and workload, not to save money. He listed several key concerns:
  • protected from latest threats
  • critical data protected
  • adapting to the platform
  • requiring specific skill-sets
  • maximizing security value
  • transparency with leadership
He also spoke about the difference between data sovereignty and data residency (they are not the same thing). He then talked about business continuity and disaster recovery for cloud environments, stressing proper testing and planning, and made a point about understanding the journey to the cloud and the proper way to plan and deploy applications to these systems.


@RISK Technologies' Sean O'Brien gave a talk on Identity & Access Management (IAM) and governance, centered on their IDMWORKS platform. He dove into a five-level diagram of how to reach a mature IAM model. After covering the model, he moved to how to assess an IAM system, for two reasons (from the slides):
  1. to determine the capabilities needed to ensure the right people get access to the right resources at the right times for the right reasons (the why)
  2. practical, structured and coherent approach to the management of users' identities and their access to systems and data (the what)
He then discussed how to maintain the IAM program through governance, ensuring the whole organization is directly involved in the process.


There were also three panels, which were harder to capture in notes for this format (there were also some technical problems with mics). They covered a wide range of questions and answers. I should have taken better notes during these sessions.

The vendor hall had numerous vendors; I thought it was alright, but a bit repetitive. Most of the technology companies had demos set up centered on dashboards and information systems where notifications showed up to be acted on. I figured it must be the age of dashboards. There were also training and certification companies and other assorted booths. A list is located on the website under sponsors and partners. Also, for those with training requirements, 6 CEUs were given out if you remained throughout the day.


Tuesday, July 16, 2019

Trace Labs Global Remote 2 CTF

On July 13, over 200 people (myself included) participated in Trace Labs's Global Remote 2 CTF for missing persons. This was my first CTF for cyber security or intelligence, and it was enjoyable.

Update (2019/07/18): @raebaker put together a great overview of how the CTF works, the dashboard, and the overall feel during the CTF. You can view his write up on Medium: Finding Missing People with Trace Labs CTF.

For the CTF, I used Buscador OS as my primary platform for researching each case. Buscador OS has many tools built in and provides pre-configured browsers (Chrome, Firefox, Tor) for conducting different types of research. I mainly did research by hand instead of using the tools, due to my lack of knowledge of the techniques and inexperience with the tools.

The book Open Source Intelligence Techniques by Michael Bazzell (site) provides good information for OSINT research. Some of the tools shown in the book are no longer available from the website; you can read about them in the IntelTechniques Forum post.

Tools Used

Spiderfoot - an OSINT automation tool

Spiderfoot provides a wide range of OSINT modules built within a Python framework. A subset of its full module list is shown below:

Module List from Spiderfoot (subset)

Someone could run any or all of the modules. I find it generally works better for research on domains, companies, and other materials than on actual people. It has a lite Google and Bing search tool; however, searching each site directly yielded more information for this CTF.

Sherlock - Locate UserNames across Social Media

Sherlock was not preinstalled in Buscador OS, so I used it through the Docker image following the information on the GitHub page. This tool was very interesting but limited as well. You give it a username, say "hunchly", and it will attempt every social media site within its list for the same username. This can locate a username used by one person across different sites.

I did find this one was very hit or miss. Still, it provides a quick way to check other sites without having to query them directly, and it is a very easy-to-use Python script.
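The core idea behind Sherlock can be sketched in a few lines of Python: substitute the username into per-site profile URL templates, then request each URL and treat an HTTP 200 as a possible hit. The two site templates below are an illustrative subset I chose, not Sherlock's actual data file, and real checks need per-site error handling (some sites return 200 for missing users), which is exactly what Sherlock's site list encodes:

```python
import urllib.request

# Tiny illustrative subset of site templates; Sherlock ships a much larger list
SITE_TEMPLATES = {
    "GitHub": "https://github.com/{}",
    "Reddit": "https://www.reddit.com/user/{}",
}

def candidate_urls(username):
    """Build the profile URL to probe on each site for a given username."""
    return {site: tmpl.format(username) for site, tmpl in SITE_TEMPLATES.items()}

def check_username(username, timeout=10):
    """Probe each candidate URL; an HTTP 200 suggests the username exists there."""
    found = {}
    for site, url in candidate_urls(username).items():
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                found[site] = resp.status == 200
        except Exception:
            found[site] = False
    return found
```

`check_username('hunchly')` would return a dict mapping each site to whether the profile URL answered, which is the same hit-or-miss signal the real tool refines.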

Skiptracer - OSINT scraping framework

Skiptracer provides a way to search for phone numbers, emails, screen names, real names, addresses, IPs, hostnames, and breached credentials. I find that for searching persons it is useful mainly in the US rather than internationally.

Some other team members used this more than I did, but I did take a look at the tool to see how it worked. Knowing a few pieces of information up front may yield more through this command-line, question-driven tool.

Thoughts on the CTF

It is staggering how many missing persons there are around the world. Some of these stories really hit home when you watch or read news coverage about a missing person, leaving you wanting to find information about that person.

During this CTF, my team had minors, cold cases, and international and domestic cases. Each one presented its own challenges when locating information on a missing person. The learning curve was steep and demanding, and I enjoyed it. The community was responsive to questions about tools and about OSINT in general; they were also responsive about case information after the end of the CTF.

The overall experience was well worth it. I only wish I could have competed in one in person and really worked to figure out the processes that directly help in support of missing persons.

Monday, July 1, 2019

DFIR OS Tsurugi

There are plenty of DFIR OSes out in the wild. SANS's SIFT Workstation, Sumuri Paladin, and the Digital Evidence & Forensics Toolkit (DEFT) are probably the best known. In a previous college class, I was shown an OS called Tsurugi, which can be downloaded from its main page at https://tsurugi-linux.org. It is named after a legendary Japanese double-bladed sword used by ancient monks, which might look something like the image below.

The OS comes in three variants, all based on Ubuntu 16.04 LTS:

BENTO is the live DFIR distribution, similar to Sumuri Paladin and the like, giving a full set of DFIR tools and a write blocker. TSURUGI Acquire is a lightweight version of the LAB edition for acquiring forensic images. TSURUGI Linux [LAB] is a complete DFIR suite that can be installed on a computer or VM.

I have yet to really test out BENTO, but I did perform a capture with TSURUGI Acquire, which uses Guymager as the imaging software. The imaging process was straightforward, and with the write blocker in place it is easy to ensure nothing is written back to the source hard drive.

This shows the main screen of Guymager when you load it up. The screenshot is from the Guymager site.

I installed the LAB version inside VirtualBox 6.0 using the red install icon on the desktop, which starts a graphical installer. It also requires you to unblock the proper drive to install to. Installation was average for a Linux wizard-style install. During the installation, I set it to log me on automatically. I should probably also configure it not to ask for the password when running things as 'su', since this is only a lab computer, similar to how SIFT works.

After installation, you get the main screen:

From here, all the tools are located under the application menu. 

The menu is categorized by type of tool (some tools appear in more than one menu). The full list of tools is located here: https://tsurugi-linux.org/documentation_tsurugi_linux_tools_listing.php#

The major issue I've had with the system is running 'apt' to update it. I found that running apt or similar updates breaks some of the Python modules. I recommend keeping a snapshot taken right after installation in case something breaks.

Other than the update problem, I find it works really well for running any CLI or GUI tool provided. The Conky window on the right shows notable stats as a quick reference while you work in the box, and it works well. I have not had a chance to use all the tools, but it runs like other Ubuntu 16.04 LTS operating systems.

Sunday, May 19, 2019

Problems with SIFT Workstation on Qubes OS 4.0

For a while now, I have had issues with SIFT Workstation in a Qubes OS VM. You can read about my issue on the sift-cli GitHub: teamdfir/sift#357

The sources.list file after I run the sift install/upgrade is as follows:

$ cat /etc/apt/sources.list
deb http://archive.ubuntu.com/ubuntu xenial universe
deb http://archive.canonical.com/ubuntu xenial partner
deb http://archive.ubuntu.com/ubuntu xenial-security multiverse
deb http://archive.ubuntu.com/ubuntu xenial-updates main universe multiverse restricted
deb http://qubes.3isec.org/4.0 xenial main
deb http://archive.ubuntu.com/ubuntu xenial multiverse

Notice the first line. It should be 'main universe'; instead it is just 'universe'. This is wrong, and you will get failures.

For some reason, I had not considered using a 'custom' repo file for this missing configuration. I created a new file at /etc/apt/sources.list.d/ubuntu-mail.list

It contained one line: deb http://archive.ubuntu.com/ubuntu xenial main

Then I ran the 'sift update' command again, and everything worked as intended.
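The workaround can be sketched as a small shell script. This sketch works on a local copy of the broken file; on the real system the files live under /etc/apt (the file name ubuntu-main.list is my illustration) and writing them requires root:

```shell
#!/bin/sh
# Work on a local copy of the broken file; the real one is /etc/apt/sources.list.
cat > sources.list <<'EOF'
deb http://archive.ubuntu.com/ubuntu xenial universe
deb http://archive.ubuntu.com/ubuntu xenial-updates main universe multiverse restricted
EOF

# If no plain-xenial archive line carries the 'main' component, supply it via a
# one-line custom repo file instead of editing the generated sources.list.
# (On the real system: /etc/apt/sources.list.d/, followed by an apt update.)
if ! grep '^deb http://archive.ubuntu.com/ubuntu xenial ' sources.list | grep -qw main; then
  echo 'deb http://archive.ubuntu.com/ubuntu xenial main' > ubuntu-main.list
fi
```

Note that the check deliberately matches only the plain `xenial` suite line, so the `main` already present on the `xenial-updates` line does not mask the missing component.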

Wednesday, May 8, 2019

Hardware for a Digital Library

Update

I decided to go with the large-format e-reader, the Onyx Boox NotePro. My reasoning: it was nearly half the cost of the 2-in-1. Using it over the last couple of weeks, I find it easy to take notes on and read. There have been a few instances where I got stuck with the UI not responding the way I anticipated, but that might have been user error with a new device rather than the device itself.

================

A while back, I talked about maintaining a digital library. A personal library might have blog posts, digital books, and digital notes. A good resource when looking for digital information is the DFIR Training webpage. However, this information is only useful if you can carry it with you.

I recently started my Master's in DFIR at Champlain College Online. During this process, I realized I needed a digital library because I travel for work. I already have a Pixel 2 XL, a Razer Blade Stealth (mid-2017, gray), and a Kobo Aura ONE. Each of these devices is useful; however, none of them truly solves my problem. The screen on the Pixel is good for reading in short bursts and researching many topics, but reading PDFs or any detailed images (such as the SANS.org cheat sheets) is horrible. The Razer Blade Stealth is a bit large in some situations; I realize it is a great ultrabook, but reading on it has the same issues as the Pixel. For e-readers, I swear by the Kobo Aura ONE (sadly discontinued). It is a great reader in most cases; however, it does not handle textbooks or larger PDFs properly, and I have one of the smaller-storage models.

What choices did I look at? A 2-in-1 computer or a pro e-reader.

For a 2-in-1 computer, the standard is the Microsoft Surface Pro 6, which provides a nice tablet for those running Windows. I am not in love with the keyboard, but it does work well, has a nice screen, and is good for note-taking. However, I am normally a Linux user (running Qubes OS 4), and the Surface does not play well with Linux from what I can tell. Next, I looked at the Lenovo X1 Tablet, which is bigger than the Surface. It also does not work 100% with Linux (probably 90%). It is cheaper than the Surface; however, according to some reviewers, the trackpad is just 'ok'. Lastly, I looked at the Eve V, a crowd-sourced tablet which, like the X1, comes with a keyboard and stylus while remaining cheaper than the Surface. The Eve V, though, still has some issues with completing orders and with hardware problems out of the box (< 5%).

For e-readers, there are two main companies, Sony and Onyx, that make professional e-readers. Sony's e-readers ONLY read PDFs, which was a killer for me. Onyx has several models that run a full version of Android, including the Google Play Store or whatever apps you choose to side-load.

In the end, I went with the Onyx BOOX NotePro. It was less than half the cost of the 2-in-1 computers while providing a good note-taking and reading experience (though not nearly as good as the Kobo Aura ONE). It may be a bit heavier than the Sony products, but it is fairly light, and even lighter than the Razer Blade Stealth. In the short time I have had it, I find it reads PDFs really well, and the epub experience is fairly good. The main drawback is that the reader turns completely off after about 30 minutes of inactivity: great for battery life, but hard on shorter bursts of reading throughout the day.