Intrepidus Group

Author Archives: jross

Getting Mallory to Run in Modern Versions of Ubuntu

Posted: July 19, 2013 – 12:49 pm | Author: | Filed under: Uncategorized

It’s been a while since we’ve talked about Mallory, and there’s a fairly common installation problem that keeps coming up, so we thought we’d address both by posting some guidance on getting Mallory running on current versions of Ubuntu.

If you aren’t familiar with Mallory, and you do pentesting of mobile (or embedded) devices, it may be worth looking into. While it is not perfect, it can be very handy for tricky situations: testing that must happen on unrooted devices, or cases where you can’t set up an HTTP proxy but can set up some other kind of connection to a LAN gateway (WiFi, ARP MiTM, routing table tricks, whatever).

To use the description we have on the introductory post:

Mallory is an extensible TCP/UDP man in the middle proxy that is designed to be run as a gateway. Unlike other tools of its kind, Mallory supports modifying non-standard protocols on the fly.

Here’s a screenshot of what the GUI looks like when running:



A screenshot of Mallory running – with a PayPal HTTPS request displayed.

Looks great, right? Well, it is pretty cool, but it’s got some rough edges.

One of the more frequently asked questions we get is “how do I install this?”. A close second is “I got the code on here, but it crashes when I try to start it up”. To address the first question: we made it a lot easier to get Mallory running by creating an installation/update script (available from the Install Guide on Bitbucket). However, even that fails to produce a running instance on new versions of Ubuntu, thanks to library pathing fun.

When you try to run Mallory after using the script to perform an installation, you’ll probably get an error that looks like this:

$ sudo python ./
Traceback (most recent call last):
File "./", line 87, in
import netfilter
File "./mallory/current/src/", line 7, in
from   pynetfilter_conntrack import Conntrack
File "/usr/local/lib/python2.7/dist-packages/pynetfilter_conntrack-0.4.2-py2.7.egg/pynetfilter_conntrack/", line 4, in
from pynetfilter_conntrack.func import *
File "/usr/local/lib/python2.7/dist-packages/pynetfilter_conntrack-0.4.2-py2.7.egg/pynetfilter_conntrack/", line 6, in
library = cdll.LoadLibrary("")
File "/usr/lib/python2.7/ctypes/", line 443, in LoadLibrary
return self._dlltype(name)
File "/usr/lib/python2.7/ctypes/", line 365, in __init__
self._handle = _dlopen(self._name, mode)
OSError: cannot open shared object file: No such file or directory


Checking the installed packages, you’ll find that the python library for conntrack has been installed, so what’s the problem?

The issue is that Ubuntu (and derivatives of it, like Mint) puts libraries in places where Mallory (and Python) don’t expect them to be. In the case of 64-bit versions of the OS, the libnetfilter_conntrack libraries are kept in: /usr/lib/x86_64-linux-gnu/

For 32-bit versions the libraries are in: /usr/lib/i386-linux-gnu

There are probably a couple of different conntrack-related files in there, usually two or so that are symlinked to the real library (at the time of this post, the real one was: ).

So, how do you fix the problem? The easiest solution is to simply add a new symlink to the real library, in /usr/lib. That can be done by running the following command:
$ sudo ln -s /usr/lib/x86_64-linux-gnu/ /usr/lib/
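If you want to confirm where the library actually lives before (or instead of) adding the symlink, the multi-arch search can be done from Python itself. This is just a sketch of our own (the find_shared_lib helper and the directory list are not part of Mallory): it globs the Debian/Ubuntu multi-arch directories for a library stem, falling back to whatever the dynamic linker knows about.

```python
import ctypes.util
import glob
import os

# Multi-arch library directories used by modern Ubuntu/Debian (and Mint).
MULTIARCH_DIRS = [
    "/usr/lib/x86_64-linux-gnu",
    "/usr/lib/i386-linux-gnu",
    "/usr/lib",
]

def find_shared_lib(stem, dirs=MULTIARCH_DIRS):
    """Return a path matching lib<stem>.so* from the multi-arch dirs,
    or fall back to the dynamic linker's view via find_library."""
    for d in dirs:
        matches = sorted(glob.glob(os.path.join(d, "lib%s.so*" % stem)))
        if matches:
            return matches[0]
    return ctypes.util.find_library(stem)

if __name__ == "__main__":
    # The stem pynetfilter_conntrack ultimately wants is the conntrack lib.
    print(find_shared_lib("netfilter_conntrack"))
```

If this prints a path under /usr/lib/x86_64-linux-gnu, that is the file your symlink in /usr/lib should point at.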

Once you’ve gotten the link set up, you should be able to run Mallory without further problems.

$ sudo python ./
[*] [2013-07-18 09:25:21,334] INFO:Logging setup complete
[*] [2013-07-18 09:25:21,335] INFO:ConfigRules.init: LOADING RULES.


Kony 2013 – A different kind of Android reversing

Posted: June 6, 2013 – 3:39 pm | Author: | Filed under: Uncategorized

We reverse engineer Android applications pretty much daily here at Intrepidus. We do it so much, in fact, that it almost becomes rote:

  1. Grab the APK
  2. Run it through apktool/dex2jar/apkanalyser/apkREthingDuJour
  3. Look at the files/smali/classes
  4. ???
  5. Profit!

It’s rare that we find something that breaks that process (I left out the MiTM of traffic, because I’m focusing strictly on the APK analysis for this post).

A recent engagement we did for a customer turned out to be one of those rare finds: an APK that was “weird”. While ethics preclude our talking about a customer’s application specifically, we were able to find other applications in the Google Play store that have the same behavior, and we can use them =)

For this post, we’re going to be looking at the Scottrade Mobile application (you can get the specific APK we used here). To begin with, we fire up the application on the device. A picture of the main activity is shown below:


Nothing unusual so far. We click on the Account icon at the top of the screen, and as expected a login activity is presented:




Fair enough. We click around a bit. Type some random strings into the account number field, check the Remember Account No. checkbox, etc. All we’re looking to do right now is make sure that any client-side storage options are being exercised – so that when we take a look at what data is persisted by the application, there will be something there.

Having walked through the app a bit, it’s time to check the file system on the device. In this case, the application data is stored in /data/data/com.konylabs.Scottrade. We grab the files using adb pull. Once we have the APK unpacked back into smali and Java classes, and the local files have been pulled from the device for examination, it’s time to move on to seeing how the application works.

The first stop we make is the AndroidManifest.xml file. This file defines the package characteristics – including all permissions used by the app, all the activities, services, and providers the application declares, etc. Basically this file contains a giant roadmap of the application. Here’s what the manifest for this application looks like:


Hrmm… our first hint at oddness. There’s not much in this manifest – especially for an application that has as many components as this one does. We already know from walking through the application at run-time that there are many activities present, so how can there be only two defined in the manifest? Looking at the main activity (identified by the <action android:name="android.intent.action.MAIN" /> intent-filter), we see that it maps back to the .Scottrade class. Since the package name is com.konylabs.Scottrade, that means we need to look at the com.konylabs.Scottrade.Scottrade.smali file to examine what’s going on. Here’s what that file contains:



Again, more oddness. We’ve seen the main activity. We know it’s got a lot of stuff going on – stock trade graphs with information presumably obtained from the Internet, various images loaded, etc. In fact, the app looked a whole lot like the usual bunch of webviews strung together. So why aren’t we seeing any of that here?

The answer is in line 2:

.super Lcom/konylabs/android/KonyMain;

What this translates to in Java is a class inheritance. Inside the file would be something like the following:
public class Scottrade extends KonyMain {
    // code goes here
}

And if we look at the Scottrade.class file in JD-GUI, we see that’s exactly what happens (and in fact, not much else at all):




So, where does KonyMain live? It’s at . That file is a whopping 5166 lines long! Using dex2jar on Windows 7 with JDK version 7, the decompiled KonyMain class file contains only a single line: // INTERNAL ERROR //

However, using dex2jar on Linux, with JDK version 6, we get a successful decompilation:



Again, this file is pretty huge, and not terribly helpful. At this point, it seems like we should look at what Kony Labs is. Turning to Google, we see that they are a mobile application company offering products that let developers take code they write and deploy it across multiple mobile platforms. So, what that means for us as reverse engineers is: we need to figure out how the Kony Labs framework is designed. Unfortunately, their application is not open source, nor is a trial available. While looking for further information we ran across this site, explaining a bit about the Kony architecture. Specifically:

KonyOne, claims it does everything. Their Eclipse-based cross-platform IDE studio helps the creation of a single code base programmed in the Lua programming language (like JavaScript) from which Kony generates native code in 7 OSs OR HTML5 OR WAP/WML browser apps.

Hmm. That’s interesting. So, it would seem that Kony takes application code written by mobile developers, and throws it all into a Lua wrapper of some kind. Armed with that information, we started looking harder at the structure of the APK, and the files stored in the device file system.

First off, we looked at the device file system structure:



We see the usual things: cache for webviews, shared_prefs, databases. But the files directory is odd. Looking in there, we see three files:


What is a .kds file? Running the file command on them says they’re Java Serialization data:
$ file *
dsAcceptDecline.kds: Java serialization data, version 5
dsAppVersion.kds: Java serialization data, version 5
dsShowStreaming.kds: Java serialization data, version 5
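The file command keys off the first four bytes: a Java serialization stream always begins with the magic 0xACED followed by stream version 0x0005 (file’s “version 5”). A minimal check of our own along the same lines (not part of any of the tools above):

```python
import struct

def is_java_serialized(data):
    """True if data starts with the Java serialization stream header:
    magic 0xACED followed by stream version 0x0005."""
    if len(data) < 4:
        return False
    magic, version = struct.unpack(">HH", data[:4])
    return magic == 0xACED and version == 0x0005

# e.g. is_java_serialized(open("dsAppVersion.kds", "rb").read(4))
```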

Examining the files in a hex editor doesn’t help much, but we can tell they contain Java Hashtables and binary objects:



Hrmm… OK. Ignoring that for the moment, let’s see what else there is to find. Going back to the unpacked APK directory we obtained using APKTool, let’s check out the usual spots:

  • Anything good in /res/values/strings.xml?
    Nope: strings_xml
  • How about /assets?
    Aha! Something interesting here:


Knowing that Kony wraps everything up in a Lua wrapper, that konyappluabytecode.o.mp3 file looks somewhat suspect. Let’s take a look at the file header and see what it has to say:


As guessed: this is not your usual MP3 file at all! It’s a Lua bytecode dump! Here’s where things start getting really interesting.
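That conclusion comes straight from the header bytes: a compiled Lua chunk begins with the escape character 0x1B followed by the literal string “Lua”, and the next byte packs the version (0x51 for Lua 5.1). A quick sniffer of our own built on that fact:

```python
def lua_bytecode_version(data):
    """Return 'X.Y' if data begins with a Lua bytecode header, else None.
    The header is ESC 'L' 'u' 'a' plus a packed version byte."""
    if len(data) < 5 or data[:4] != b"\x1bLua":
        return None
    ver = data[4]  # e.g. 0x51 -> Lua 5.1
    return "%d.%d" % (ver >> 4, ver & 0x0F)

# e.g. lua_bytecode_version(open("konyappluabytecode.o.mp3", "rb").read(5))
```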

It turns out that the entire Android “application” is actually stored inside the konyappluabytecode.o.mp3 file. Running strings on that file shows evidence of this:


Unfortunately, the Lua bytecode isn’t readable natively, because it’s bytecode. Some looking around revealed a few tools that can be used to disassemble Lua code, but the one that worked best for us was ChunkSpy. This handy tool is actually a Lua script that converts binary Lua bytecode back into a verbose listing, similar to what you would expect to see in a standard binary debugger. Running the konyappluabytecode.o.mp3 file through ChunkSpy was simple enough, once we installed Lua 5.1 onto a Linux system. The usage is shown below:



For our bytecode, all we had to do was run the following command:

ChunkSpy.lua -o out.txt konyappluabytecode.o.mp3

An example of what the output looks like is shown here:



Where things went from there is the subject of another post, coming soon…




Unhosing APKs

Posted: October 5, 2012 – 5:35 pm | Author: | Filed under: Uncategorized

Recently, there has been some discussion in the press about a tool named “HoseDex2Jar”, which claims to prevent wily hackers from being able to decompile Android APK files back into Java class files. Such a tool would be very useful, it is claimed: it is purported to stop said hackers from enacting their nefarious actions, and to protect the assets within Android applications that developers are trying very hard to protect. In fact, the president of the company offering the service is quoted as saying, “We realized if there was a way to stop Dex2Jar, we would stop all Android decompilation.”

While we think it’s great to see folks moving in a direction to raise the security bar for mobile applications, claims such as this are a bit misleading. To understand why, it’s important to realize that there are a couple of very different things meant by “decompiling” when it comes to the Android platform.

The HoseDex2Jar tool claims to prevent people from running the tool “dex2jar”, which is a utility that converts the Android APK “classes.dex” file back into Java classes. The java classes can then be turned back into readable Java code using any one of various Java decompilers available (there are both open source and commercial versions of such tools).

What the HoseDex2Jar tool does not claim to do is prevent the conversion of the classes.dex file back into the Dalvik disassembly, called smali. While it can be handy to convert APK files back into Java classes (because Java can be easier to read than smali), we’ve found that using the smali code tends to provide better results. That is because the smali code is more accurate, and because it is essentially Dalvik disassembly, it is almost always easier to manipulate when attempting to develop a meaningful attack. When converting APKs back into Java, the decompilers need to make a lot of guesses about the code, which distorts it from the original application. The Java view is useful as a quick overview, but it generally can’t be altered and then successfully recompiled back into a new APK. (We wrote about this a bit a couple years ago.)

Because HoseDex2Jar only prevents the flawed “to Java” version of Android decompilation, completely ignoring the much more accurate decompilation into smali code, the claim that HoseDex2Jar prevents all Android decompilation is simply false, and the protection it does provide is almost useless as a security measure.

As an example of how futile this is, we used the HoseDex2Jar tool to “protect” a small app we wrote. We registered with the web service, and obtained our “hosed” APK back. Sure enough, running dex2jar on the hosed APK failed, with a Java null pointer error:

dex2jar info.dc585.SMSniffer-1-HOSED-MUST-RESIGN.apk -> info.dc585.SMSniffer-1-HOSED-MUST-RESIGN-dex2jar.jar

at org.objectweb.asm.Type.getType(Unknown Source)
at com.googlecode.dex2jar.v3.V3ClassAdapter.visitField(
at com.googlecode.dex2jar.reader.DexFileReader.acceptField(
at com.googlecode.dex2jar.reader.DexFileReader.acceptClass(
at com.googlecode.dex2jar.reader.DexFileReader.accept(
at com.googlecode.dex2jar.v3.Dex2jar.doTranslate(

However, baksmali had no problems, nor did apktool.

apktool d -d info.dc585.SMSniffer-1-HOSED-MUST-RESIGN.apk
I: Baksmaling...
I: Loading resource table...
I: Loaded.
I: Loading resource table from file: C:\Users\user1\apktool\framework\1.apk
I: Loaded.
I: Decoding file-resources...
I: Decoding values*/* XMLs...
I: Done.
I: Copying assets and libs...

More interestingly, simply using apktool to repackage the directory, without making a single change to smali files, is also successful:

apktool b -d info.dc585.SMSniffer-1-HOSED-MUST-RESIGN info.dc585.SMSniffer-1-DEHOSED.apk
I: Checking whether sources has changed...
I: Smaling...
I: Checking whether resources has changed...
I: Building resources...
I: Building apk file...

And guess what? After running the two very simple commands above, we can now simply use Dex2Jar again, despite the fact that this APK was initially hosed:

dex2jar.bat info.dc585.SMSniffer-1-DEHOSED.apk
7 [main] INFO - dex2jar info.dc585.SMSniffer-1-DEHOSED.apk -> info.dc585.SMSniffer-1-DEHOSED.apk.dex2jar.jar
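The whole dehosing exercise is just the two apktool commands shown above, so it scripts trivially. Here is a rough sketch of our own (the dehose helper is hypothetical; it assumes apktool is on the PATH and simply mirrors the command lines from the post):

```python
import subprocess

def dehose_commands(apk):
    """Build the round-trip: baksmali the hosed APK with apktool,
    then rebuild it without touching a single smali file."""
    srcdir = apk[:-len(".apk")] if apk.endswith(".apk") else apk
    return [
        ["apktool", "d", "-d", apk],                              # decode
        ["apktool", "b", "-d", srcdir, srcdir + "-DEHOSED.apk"],  # rebuild
    ]

def dehose(apk):
    # Run both steps; the rebuilt APK decompiles with dex2jar again.
    for cmd in dehose_commands(apk):
        subprocess.check_call(cmd)
```

(Remember that the rebuilt APK still needs to be re-signed before it will install.)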


Edit: So, it turns out that unhosing is even easier than we initially thought: if you use versions of dex2jar <, the hosed APK can be decompiled back to Java classes as though no changes occurred at all. It appears that the company offering this service only tested against the current version of dex2jar, and not prior incarnations.


Fun With The DefCon 20 Ninja Badge

Posted: August 6, 2012 – 11:09 am | Author: | Filed under: Uncategorized

This year marked the 20th anniversary of the DefCon conference. Each year at DefCon, there’s a mad dash from attendees to try to gain access to what many perceive to be the best party during the con. The party is hosted by a group of folks known as Ninja Networks, and for the past 3 years, access to the party was granted through obtaining a custom badge, known as the Ninja badge.

Sadly, the folks that comprise Ninja Networks announced (in this post on the DefCon forum) that this would be the last year of the Ninja party. To celebrate, the ninjas went all out when they came up with badges this year. This post is going to talk about the ninja badge, and how we were able to earn one this year.

A Bit About the Badge

The Ninja badge this year was an HTC One V Android smartphone. That’s right: by obtaining one of the badges, you were handed a mobile device that costs roughly $300. The ninjas wrote their own custom ROM for the devices, as well as multiple applications to be used during the conference. The applications included things like a game that could be played with other ninja badge holders, a centralized chat application, and an app which communicated with a vending machine via bluetooth to dispense beverages to badge holders.

That’s all very cool, “But wait! There’s more!“.

NinjaTel Logo

A Hacker Cell Network

The ninjas didn’t stop at making a cool badge this year. In addition to having custom smartphones the ninjas set up their own cell network called NinjaTel. This cell network allowed the ninja badges to communicate with other badge holders during DefCon. Badge holders could call, IM, and send private messages to each other.

So amazing was this setup that the Wall Street Journal has a post on their blog about it. The NinjaTel network was created using USRP devices set to GSM frequencies. Voice traffic was managed by an Asterisk VoIP infrastructure. The contact database on the phone was synchronized with a central directory maintained by the ninjas, and each person that got a badge was registered with the NinjaTel network. The whole shebang was managed and run out of a cargo van parked in the DefCon Vendor area.

Needless to say, being interested in all things mobile, we were drooling when we found out about all of this. We made it a personal goal to do everything we could to get one of these badges to play with. For those that aren’t aware, the ninja badge is intended to be a social thing. They are given out to friends and family, and to people that the ninjas feel contribute back to the community.

Since none of us knew any of the ninjas personally, chances were slim that we’d be able to get one. After about a day of talking to everyone we knew that had a ninja badge, it was clear that the only way we were going to get one was by somehow earning it. Fortunately, a good friend was willing to lend us hers for a while.

First Impressions

Once we had one of the devices in our hands, we disappeared into our hotel room for the rest of the day to begin hacking. The first thing we did was reboot it. We were immediately floored by how nifty the ROM was. The ninjas created their own boot animation, complete with Pat Fleet welcoming us to NinjaTel.

Once the device had booted up, we immediately powered it down and booted it again – this time into the bootloader menu (by holding the VOL-DOWN button while powering the device on). The device was running an HTC dev build, and the bootloader was locked.

Because this wasn’t our own badge, we were hesitant to do anything too disruptive, like unlocking the bootloader and flashing a recovery image such as ClockworkMod. Instead, we brought the device back up and dumped the /system and /data directories via ADB. We had no problems getting the device recognized by ADB, but it seems that others needed to add udev entries to their configuration. Travis Goodspeed posted the following settings to pastebin:

#NinjaTel HTC One V 0bb4:0ce5, 0bb4:0ff9, 0bb4:0ff0
#The first allows for adb, while the latter two are needed for fastboot.
SUBSYSTEMS=="usb", ATTRS{idVendor}=="0bb4", ATTRS{idProduct}=="0ce5", MODE:="0666", SYMLINK+="HTCONEV"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="0bb4", ATTRS{idProduct}=="0ff9", MODE:="0666", SYMLINK+="HTCONEV"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="0bb4", ATTRS{idProduct}=="0ff0", MODE:="0666", SYMLINK+="HTCONEV"

Hacking the Badge

We then spent the next few hours dissecting the various ninja applications and poking around on the system. We quickly came to the conclusion that the device was running Android ICS, because we noticed the presence of the directory /system/etc/security/cacerts/ (this directory doesn’t exist prior to ICS; in earlier versions the certificate store is a BKS-formatted file at /system/etc/security/cacerts.bks). It didn’t seem likely that the devices would be running Jelly Bean, and this guess seemed confirmed when we observed the following entry in the /system/build.prop file:

This was all neat, but none of it was helping us to earn a badge. The ninjas had created a very cool launcher, but there was no apparent way to get to any apps other than those built into their custom UI. Similarly, there was no way to access the system settings. A picture of what the main screen looked like is below:

ninja badge home screen

It struck us as odd that there would be no mechanism to access settings, so we looked a little deeper to see if there was some hidden way of accessing these items.

Android Hidden Codes

One of the tools we created for our testing checks the device for applications that have configured secret codes. Secret codes can be used by applications to launch intents. Access to the intent is gained by dialing a specific string of digits in the dialer application. For example, many Android devices contain an information activity which can be accessed by dialing *#*#4636#*#*.

An application registers secret codes by using the following elements in the AndroidManifest.xml file:

<action android:name="android.provider.Telephony.SECRET_CODE" />
<data android:scheme="android_secret_code" android:host="3333" />

Running through all the applications on the NinjaTel device didn’t show any helpful secret codes. Just the usual Google and HTC specific codes were present:

CheckinProvider.apk *#*#682#*#*
CheckinProvider.apk *#*#682226#*#*
CheckinProvider.apk *#*#682364#*#*
CheckinProvider.apk *#*#682668#*#*
FieldTest.apk *#*#7262626#*#*
FlexNet.apk *#*#361066#*#*
FlexNet.apk *#*#361166#*#*
FlexNet.apk *#*#362066#*#*
FlexNet.apk *#*#366633#*#*
FlexNet.apk *#*#36666#*#*
FlexNet.apk *#*#3688633#*#*
FlexNet.apk *#*#368866#*#*
FlexNet.apk *#*#7669633#*#*
GSD.apk *#*#3424#*#*
GooglePartnerSetup.apk *#*#759#*#*
GoogleServicesFramework.apk *#*#2432546#*#*
GoogleServicesFramework.apk *#*#46#*#*
GoogleServicesFramework.apk *#*#7867#*#*
GoogleServicesFramework.apk *#*#8255#*#*
GoogleServicesFramework.apk *#*#947322243#*#*
Phone.apk *#*#2347#*#*
Settings.apk *#*#2900#*#*
Settings.apk *#*#29000#*#*
Settings.apk *#*#2911#*#*
Settings.apk *#*#29111#*#*
Settings.apk *#*#4636#*#*


Since we couldn’t really do a whole lot through the device UI, any ideas we came up with to try to earn a badge required us to be more hands-on with the device than we were willing to risk with a friend’s phone. Disheartened, we quit for the night at around 2am.

When All Else Fails, Code Your Own Way

Friday was pretty busy for all of us, so we didn’t get much of a chance to play with the badge. We heard a lot of friends that had one asking about what version of Android the device was running, and wondering how to get to system settings and other apps. We also heard about an OTA update that NinjaTel was performing, and a lot of folks trying to guess whether or not they had received it. It occurred to us about halfway through the morning that writing a simple app to launch system settings – and showing which version of NinjaTel the device was running – could be useful to a number of folks. Since the ninja badge had a dialer app, this launcher could be called by registering our own Android secret code. Once things calmed down a bit, we sat down to do just that.

We wrote up a very simple app: just one activity and a BroadcastReceiver (to catch the secret code when the dialer broadcasts it). The activity had three buttons: one to bring up the system settings, one to launch Facebook, and one to launch Google Music. We set it up so that the app would launch when the user dialed *#*#303#*#* from the dialer app. Here’s what it looked like once it was running:

Mission Accomplished!

We wanted to show the OTA version information on the screen as well, so we went up to the NinjaTel booth in the Vendor area, showed them what we had, and asked where the version information was being stored. They told us, and then asked if we wanted a badge! Of course we said yes – how could we not?

Now that we had one of our very own to play with, we could better appreciate the effort that went into creating the ninja badges. Among other things, the ninjas had created their own SIM cards:

Dump & Flash

One of the first things we did was dump the ROM and make a Nandroid image. To do that, we needed to unlock the bootloader. We next installed ClockworkMod by running the following:

fastboot flash recovery recovery-clockwork-

Once we had all that done, we flashed the Modaco ROM image (with HTC Sense) onto the device.

Props & Thanks

Serious kudos to the ninjas for going all out and making what is probably the coolest “badge” we’ve ever seen. The badge is so well done that people are still playing with it: one gentleman tweaked a stock ROM image to retain the NinjaTel branding, and the badge was used to prank a Radio Shack employee. To encourage the fun, the ninjas have begun posting the source code for their apps over at GitHub. We’re glad we got a chance to play along this year.


Excuse me, your clouds are leaking

Posted: January 18, 2012 – 10:31 am | Author: , and | Filed under: Articles

I recently started playing around with Gliffy, a nice online diagramming tool that has become quite popular. Gliffy makes sharing your diagrams with the world easy. Unfortunately, many Gliffy users do not realize that they are sharing their diagrams with the entire world. Some quick Google searches revealed a number of entertaining diagrams.

This data ranges from the boring to the concerning. I held back a few that I felt would not be responsible to disclose. At any rate, this highlights the dangers of using “cloud services” without educating employees about the inherent risks involved. Also, some of this is just plain laziness from those who probably know better.

After assuring Google I was indeed a human about a dozen times, here are the highlights:

Also, SOPA and PIPA are bad. Please let your representatives know. See: for a nice write up.

@bitexploder, @sorcerer13 and @rossja




This is not the Android Market Security Tool you are looking for

Posted: March 11, 2011 – 12:46 am | Author: , , , and | Filed under: android, android.bgserv, Cryptography, jailbreak, jailbreaking, Mobile Security

We have been actively following and analyzing the spate of malware in the Android Market. The most recent outbreak to light up the blog-o-sphere has been Droid Dream. Google’s response was to launch a search and destroy mission: they created the Android Market Security Tool (AMST) and pushed it to all handsets that were known to have downloaded and installed infected applications. This tool disinfects the compromised handsets by eradicating all remnants of the Droid Dream trojan. However, what we found quite interesting is that shortly after the release of AMST, a trojaned version of the AMST appeared and is making the rounds on the internet! (Yo dawg…)

Symantec performed an initial analysis on this piece of malware. They found some interesting links between the malware and a hosted Google code project. This sparked our interest and we decided to get a sample of this malware and perform our own analysis.

The first obvious difference is that the application is requesting very different permissions from the official Google tool, including the ability to change network settings and perform actions, such as send and receive SMS, which can be used fraudulently.

This image illustrates the permissions the application is granted. In particular, it is allowed to change the network state, a fact which becomes important when coupled with some of the capabilities of this malware, which we will discuss shortly. The features of this malware are almost identical to the Fake 10086 malware, which has been previously analyzed. When we looked at the disassembled version of the fake AMST, it does appear extremely similar to the code found in this Google Code repository. Keep in mind: this is malware targeting Android devices that uses Google’s own code hosting system to track its own development.

So what does this open-source malware do? One thing it does is change your Wireless Application Protocol (WAP) server and your APN. The capture below shows the Java version of the code doing that:

How this capability is used is unclear, but the fact that it sets your APN, which is essentially your point of access to the Internet on the carrier network, is a bit troubling. Additionally, the application has the ability to intercept SMS messages by abusing the RECEIVE_SMS permission. It uses this to filter out SMS from certain numbers, so that the malware can receive SMS messages the user is never aware of. The application then responds and takes actions based on those messages. Again, here is the Java code for clarity:

Another interesting feature of this malware is that it hooks the phone call receiver. On any call the phone receives, it looks for two specific numbers: “10086” (hence the name Fake 10086 for this malware’s variant) and “10010”. Both of those numbers are associated with Chinese telecom carriers. The main purpose of these lovely “features” is to prevent the user from receiving support related to the malware, keeping it on the device. The main purpose of the malware itself is to message a “vedio” service, err, “video” service, and rack up SMS text charges. The capture below illustrates some of the URLs and VEDIO love.

private static final String CMWAP = "cmwap";
public static final String CMNET = "cmnet";
// private static final String SERVER_URL = "";
private static final String SERVER_URL = "";
// private static final String VEDIO_URL = "";
private static final String VEDIO_URL_REAL = ";chooseUrl=QQQwlQQQrmw1sQQQpp66.jsp";
private static final Uri uri_apn = Uri.parse("content://telephony/carriers/preferapn");
private static final Uri uri_apn_list = Uri.parse("content://telephony/carriers");

Seems pretty ballzy to have the malicious source code posted on a Google Code repository, so we wanted to know more about this aspect. While the author seems to have worked on a few other projects, “mmsbg” seems to be the only thing recently updated by this account on Google Code. We had to wonder what kind of jokes this guy would put in the signature used to sign the trojaned APK file. A quick “keytool -printcert” of the CERT.RSA file is listed below. Notice the “”.

Let’s see what is up at Seems to be an Android developer with some APK files posted. Wonder what cert was used to sign the “AnReboot Widget” package which you can download from the site.

Notice the matching fingerprints (the MD5 and SHA1 lines). Looks like both the malware sample and the “AnReboot Widget” posted on the Londatiga site are signed with the same private key. What are the chances of that… we thought. Turns out Lorenz dropped a private key in a tutorial blog post on signing Android applications, and then used it for this app (although his Android Market apps are signed with a different key). Anyhow, it is a good reminder that they’re called “private” keys for a reason. It might be time to generate a new one if you are using a key that has been downloaded by others.

Another point of interest is the whois result for the IP address in the malware:

inetnum: -
netname: CMNET-shanghai
descr: China Mobile Communications Corporation - shanghai company
country: CN
admin-c: HL888-AP
tech-c: HL888-AP
mnt-lower: MAINT-CN-CMCC-shanghai

The malware makes HTTP POSTs to this address. One final point we will raise is that the malware can also set the user’s APN to a CMNET APN.

Regardless of the true intent of this malware, the malware authors of the world have clearly struck the first blow in the mobile malware war. This will be a fascinating space to watch as the collision of malware, personal data and mobile devices occurs.

@bitexploder, benn, jross, sid and DIAB1069


