(My) BruCON 2015 notes (4)

Here are my quick notes from the BruCON 2015 conference. All the slides can be found here.

Malware is just code, so, like any other code, it is possible (in theory) to analyze/reverse engineer it manually.

Triage is one of the functions of an incident response program and must answer the following three questions about a specific input:

  • is the input malicious?
  • if yes, what is it exploiting?
  • are we exposed?

Triage is not malware analysis and should be quick and efficient. The triage workflow:

  • passive analysis.
  • first interaction and download.
    • some malware is crafted so that the initial URL can be contacted only a limited number of times
    • some malware may profile your browser: check the browser version and platform, or use the user agent string to decide whether the exploit should be executed or not.
    • some tools: useragentstring.com (to check your user agent), onlinecurl.com (on-line version of curl: paste a URL and you get back the response), hurl.it (same as the previous one); a minimal scripted alternative is sketched after this list.
  • web component analysis.
    • once you have the web component (which is typically an HTML page + JavaScript) you can try to analyze it.
    • use jsBeautify.org to get something human-readable in case the code is obfuscated (a scripted alternative is sketched after this list).
    • try to use the browser debugger and, if needed, replace JavaScript eval expressions (for example, with something that prints what they would execute).
  • exploit analysis.
    • you can use showmycode.com to understand the exploit; it can decompile Java, Flash, .NET and PHP
    • sometimes you can blindly search the Metasploit exploit template library
  • payload extraction.
  • payload analysis.
    • you can submit the file(s) to VirusTotal or malwr (virtualized Cuckoo instances); a scripted submission is sketched after this list.
    • malwr can give you information about the registry keys created and the network traffic.
  • build IOCs (Indicators of Compromise).
    • a collection of indicators that can be used to describe a compromised system (see the small example after this list).
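
For the first interaction step, a small script can replace onlinecurl.com when you just need the raw response. This is a minimal sketch, assuming Python with the requests library; the URL, the User-Agent value and the output file name are placeholders, and it should only be run from a disposable/isolated machine.

```python
import requests

SUSPECT_URL = "http://example.com/landing.html"  # placeholder, not a real sample URL

# Some samples profile the User-Agent before serving the exploit, so mimic a
# plausible (old) browser instead of a command-line client.
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko"
}

response = requests.get(SUSPECT_URL, headers=headers, timeout=10)

print(response.status_code, response.headers.get("Content-Type"))

# Save the raw body for the web component analysis step.
with open("web_component.html", "wb") as out:
    out.write(response.content)
```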
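
For the web component analysis step, the same beautifying can be done offline. A minimal sketch, assuming the Python port of JS Beautifier (pip install jsbeautifier); the file names are the placeholders from the previous sketch.

```python
import jsbeautifier

# Read the (possibly obfuscated) web component saved earlier.
with open("web_component.html", "r", errors="ignore") as f:
    obfuscated = f.read()

# Re-indent and expand the code so it becomes human readable.
readable = jsbeautifier.beautify(obfuscated)

with open("web_component_pretty.js", "w") as f:
    f.write(readable)
```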
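
For the payload analysis step, the submission to VirusTotal can also be scripted. This is a hedged sketch, assuming the VirusTotal public API v2; the API key and the sample file name are placeholders, and the free key is rate limited.

```python
import requests

API_KEY = "YOUR_VT_API_KEY"   # placeholder, use your own key from virustotal.com
SAMPLE = "payload.bin"        # placeholder name for the extracted payload

# Submit the sample for scanning (public API v2).
with open(SAMPLE, "rb") as f:
    scan = requests.post(
        "https://www.virustotal.com/vtapi/v2/file/scan",
        params={"apikey": API_KEY},
        files={"file": (SAMPLE, f)},
    )

print(scan.json().get("permalink"))
```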
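
And for the last step, an IOC really is just a structured collection of indicators. Below is a tiny illustration, not any particular IOC format: all the indicator values are made up, and the hash check only shows how such a collection can be used against a local file.

```python
import hashlib

# A made-up indicator set describing a (fictional) compromised system.
iocs = {
    "sha256": {"0000000000000000000000000000000000000000000000000000000000000000"},
    "domains": {"bad-domain.example"},
    "registry_keys": {r"HKCU\Software\EvilPersistence"},
}

def file_matches_ioc(path):
    """Return True if the file's SHA-256 appears in the indicator set."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in iocs["sha256"]

print(file_matches_ioc("payload.bin"))  # placeholder file name
```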

This was a workshop, so the participants had to play with some of the tools. Here is the quick workflow that I followed:

start from a URL -> use onlinecurl.com to get the response (initial interaction) -> save the response to a file and use the browser debugger to understand what the component is doing (web component analysis) -> get from the JavaScript another URL that contains a link to a Java .class file -> use showmycode.com to decompile the class file (exploit analysis) -> write some Java code to decode parts of the exploit and execute it on ideone.com (payload extraction) -> …time over :(.
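
My notes do not record what the actual encoding was, so here is only a hypothetical sketch of the "decode parts of the exploit" step, assuming the common case of a single-byte XOR-encoded string found in the decompiled class; the encoded bytes and the key below are made up, and unlike in the workshop it is Python rather than Java.

```python
# Made-up encoded blob and key; with this key the blob decodes to b'http'.
encoded = bytes.fromhex("061a1a1e")
key = 0x6e

decoded = bytes(b ^ key for b in encoded)
print(decoded)
```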