New mobile threat families and variants rose by 49% from last quarter, from 100 to 149. Of these, 136, or 91.3%, were Android and 13, or 8.7%, Symbian. The Q1 2013 numbers are more than double those of a year ago in Q1 2012.
While the “walled gardens” of iOS and Windows Phone, where apps require approval before sale, have prevented malware threats from developing for the iPhone or Nokia models running those systems, Android threats are increasing and becoming more likely to affect average users.
“I’ll put it this way: Until now, I haven’t worried about my mother with her Android because she’s not into apps,” F-Secure Security Advisor Sean Sullivan said. “Now I have reason to worry because with cases like Stels, Android malware is also being distributed via spam, and my mother checks her email from her phone.”
You can get the entire report here, and as you read through it, listen to our Chief Research Officer Mikko Hypponen and Sean Sullivan walk through the report in this exclusive preview. (Sorry, there is an odd echo for the first few minutes of the recording.)
Here’s a look at profit-motivated threats. Is anyone surprised that mobile malware authors are mostly motivated by money?
As for the types of threats our Labs are seeing, Trojans continue to dominate:
How do we balance privacy and crime fighting? That’s one of the big questions now that we are entering the digitally connected era. Our western democracies have a set of well-established and widely accepted rules that control what authorities can and can’t do. One aspect of this has been in the headlines lately: your right to “plead the Fifth”, as the Americans say. Laws differ from country to country, but most have something similar to the US Fifth Amendment.

The beef is that “No person … shall be compelled in any criminal case to be a witness against himself…”. Or, as often expressed in popular culture: “You have the right to remain silent.” In fancier words, protection against self-incrimination. In practice this means that no one can force you to reveal information if the authorities suspect you of a crime. You have the right to defend yourself, and refusing to disclose information is a legal defense tactic. But the police can search your home and vehicles for items, if they have the proper warrant, and there’s nothing you can do to stop that. In short, the Fifth Amendment protects what you know but not what you have.

Sounds fair. But the problem is that there was no information technology when these fundamental principles were formed back in 1789. The makers of the Fifth Amendment, and of similar laws in other countries, could not foresee that “what you know” would expand far beyond our own brains. Our mobile gadgets, social media and cloud services can in the worst case store a very comprehensive picture of how we think, whom we have communicated with, where we have been and what we have done. All of this is stored in devices, and thus available to the police even if we exercise our right to remain silent.

Where were you last Thursday at 10 PM? Do you know Mr John Doe? What’s the nature of your relationship with Ms Jane Doe? Have you purchased any chemicals lately? Do you own a gun? Have you traveled to Boston during the last month?
Have you ever communicated with email@example.com? These are all questions an investigator could ask you. And all of them may still be answered by data in your devices and clouds, even if you exercise your right to remain silent. So has the Fifth Amendment lost its meaning? Would its original makers accept this situation, or would they make an amendment to the amendment?

The situation is pretty clear for social media and cloud storage. This data is stored in some service provider’s data center. The police can obtain a warrant and then get your data without any help from you.(* The same goes for computers they take from your home. The common interpretation is that this isn’t covered by the Fifth Amendment.

But what if you store encrypted files on those servers? Or use a device that encrypts its local storage (modern Androids and iPhones belong to this category)? In these cases the police will need the password. That is something you know, which makes it protected. This is a problem for the police, and countries have varying legislation to address it. The UK takes an aggressive approach and makes it a crime to refuse to reveal a password. Memorized passwords are, however, protected in the US, as a recent case demonstrated.

Biometric authentication is yet another twist. Imagine that you use your fingerprint to unlock your mobile device. Yes, it’s convenient. But it may at the same time reduce your Fifth Amendment protection significantly. Your fingerprint is what you are, not what you know. There are cases in the US where judges have ruled that forcing a suspect to unlock a device with a fingerprint isn’t in conflict with the constitution. But we haven’t heard the Supreme Court’s ruling on this issue yet.

So the Fifth Amendment, and equivalent laws in other countries, is usually interpreted to protect only information stored in your brain. But that definition is quickly becoming outdated and very limited.
This is a significant ethical question. Should we let the Fifth Amendment deteriorate and give crime fighting higher priority? Or should we accept that our personal memory expands beyond what we have in our heads? Our personal gadgets no doubt contain a lot of the kind of information that the makers of the Fifth Amendment wanted to protect. If I have the right to withhold a piece of information stored in my head, why should I not have the right to withhold the same information stored elsewhere? Is there really a fundamental difference that justifies treating these two storage types differently?

These are big questions where different interests conflict, and there are no perfect solutions. So I pass the question to you. What do you think?

[polldaddy poll=9102679]

Safe surfing,
Micke

Image by OhLizz

(* It is this simple if the police, the suspect and the service provider are all in the same country. It can get very complicated in other cases, but let’s not go there now, as that would be beside the point of this post.)
Despite Apple's stringent "walled garden" approach, which requires strict approval of all software that ends up in its App Store, dozens of apps infected with XcodeGhost malware apparently made it through the store and onto millions of users' devices. The malware gives attackers remote access, which can lead to phishing or further exploitation of vulnerabilities.

Our Labs' initial take on this incident is that it appears to be another case of "convenience is the enemy of security". Reports suggest developers were using a Trojanized version of Xcode, Apple's official tool for building iOS and OS X apps. Developers may have turned to third-party copies of Xcode to avoid long download times. Some also disabled Gatekeeper, the OS X feature that would have blocked the tainted copies from running, because its checks take too long, especially on older machines. These not-so-secure practices likely led to a rare breach of iOS security.

F-Secure Freedome is already blocking the command and control servers used by the infected apps. This will interrupt their ability to work properly or steal information from a Freedome-protected device. You should check that you have not installed any of the infected apps, which include some of the most popular apps in China, and only install apps from developers with a track record you can trust.
We have written a lot about how companies treat you as an asset: a source of data that can be monetized in a variety of ways. Spotify recently changed its terms and ensured that this topic stays in the headlines. The company wants to collect information stored on your mobile device, such as contacts, photos and media files. No thanks! My Spotify app plays music just fine even without access to pictures and contacts. And the new terms backfired big time! Spotify's response: Sorry!

Spotify is not the only one. A lot of companies depend on user data, from Facebook and Google to utility developers. So this is a significant privacy challenge.

But we have privacy legislation. It’s supposed to protect us and set limits on what data Spotify et al. can scoop up and how they can use it. Right? Well, that line of defense has unfortunately fallen already. Yes, there is legislation. But it’s your data and you can decide what to do with it. You are free to sign away your rights to it, and many companies take advantage of that. I bet you have signed a lot of user agreements without reading the fine-print legalese. That’s where you disable most of the protection the law could have offered.

Fortunately there is a second line of defense, and it’s much stronger. Your Spotify app can only upload data it has access to. Mobile operating systems like iOS and Android were designed in an era when we were already aware of the privacy threats. They have several security benefits over desktop systems, and app permissions are definitely one of the most important. In short, apps you install can’t access everything on the device by default. They must ask for permissions, and you can decide what data and functions a given app gets access to. This is your last best hope for keeping your private data private. So you’d better learn the importance of app permissions! Fortunately, they are very easy to use. And you have already used them.
After installing an app you almost always get a prompt telling you that it wants permission to do something. The most important advice is to stop and think at this point! Don’t let app permissions become just another boring thing you click through. Your last line of privacy defense falls if you do.

Common sense is enough to use app permissions. Just think about what the app is supposed to do. In Spotify I search for music or start a playlist. Neither action depends on where I am, so the Spotify app has no real need to access my location. An app that helps me call the emergency number is a totally different cup of tea. It can upload my exact location to the operator, and that is in fact the main reason for implementing it as an app. So it is natural that this app has a legitimate reason to access my location. And neither of those apps needs to paw through my contacts, so any request to access contacts should be denied. This is the kind of thinking you need to learn.

iPhone currently handles app permissions better than Android. Android apps declare what they want, and you can review the list before installing the app. That sounds great but works less well in practice. The main problem is that it’s take it or leave it: your only option is to reject the whole list if you dislike one thing the app wants to do, which usually means the app won’t install. App developers can sneak in a lot of extra permissions because rejecting the list isn’t a true option in most cases. Android app permissions have actually become just like user license agreements: only a few pay any attention to them.

iPhone is smarter. Apps install without any questions about permissions, but the system asks the user when the app tries to access restricted content. The app can’t pressure the user into granting unnecessary permissions by threatening not to install at all. And the user has granular control over permissions; it’s not take it or leave it.
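To make the Android side concrete, here is a minimal sketch of where that all-or-nothing list comes from. The developer declares every permission up front in the app’s AndroidManifest.xml; the app below is hypothetical, but the permission names are real Android ones:

```xml
<!-- AndroidManifest.xml of a hypothetical music player app -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.musicplayer">

    <!-- Needed to stream music: a reasonable request for a player -->
    <uses-permission android:name="android.permission.INTERNET" />

    <!-- Not needed to play music: worth questioning before you install -->
    <uses-permission android:name="android.permission.READ_CONTACTS" />
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

</manifest>
```

Everything declared here is presented to you as one list at install time, and rejecting any single line means not installing the app at all.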
On iOS, every sensitive piece of content or service is handled separately. This is clearly a better approach. So much better, in fact, that the next Android version, Marshmallow, will copy this system.

Moral of the story: app permissions are your friend. And you definitely need allies to help protect your privacy.

Safe surfing,
Micke

[caption id="attachment_8440" align="aligncenter" width="240"] My Facebook permissions. Location is a no-no. And I don't want to shoot pictures from the app. But access to the photos is needed to post shots.[/caption]

Pictures: twitter.com and iPhone screenshots