I Hacked My Own Flutter App: What I Found Shocked Me
Pinki Singh · 6 min read · Dec 27, 2025
I recently stepped into the world of Flutter app security, and what I discovered completely changed the way I think about mobile development.
Seeing my own app’s data leaking and uncovering vulnerabilities I never knew existed was a wake-up call. I realized something crucial:
Writing code is not enough. Security must be a first-class concern.
Today, users care deeply about privacy and data protection. As developers and builders, we carry the responsibility of safeguarding user data. One weak link can compromise everything.
What Is Mobile Security?
Mobile security is about protecting three critical assets:
- Your API keys and secrets
- Your backend endpoints
- Your users’ data
If attackers can reverse-engineer your app, all three are at risk. What I learned is basic, but it was more than enough to open my eyes.
The Experiment: Reverse-Engineering My Own App
I decided to reverse-engineer my Flutter app called “Wordly”. Here’s exactly how I did it.
The Setup
First, generate a release APK. This is important because it represents what real users receive.
flutter build apk --release
Many junior developers assume release builds are fully secure by default. I used to believe the same—until this experiment.
I used two techniques:
- JADX — to inspect the Android shell
- String extraction — to analyze the Flutter binary
Stage 1: JADX (The Android Shell)
JADX allows us to inspect the Android container. It exposes:
- AndroidManifest.xml
- Permissions
- Signing configuration
- Native plugins
Step 1: Install JADX
- Windows: Download jadx-gui-with-jre from GitHub Releases
- macOS/Linux:
brew install jadx
or download the ZIP from GitHub Releases
Step 2: Open the APK
Drag and drop app-release.apk into the JADX GUI.
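Since an APK is just a ZIP archive, you can also peek inside it without JADX. A minimal Python sketch of what JADX unpacks, demonstrated on a fake in-memory APK (the real path would be the release APK from the build step above):

```python
import io
import zipfile

def interesting_entries(apk_path):
    """List APK entries worth inspecting: the manifest, DEX bytecode,
    and native libraries (Flutter's Dart snapshot is lib/<abi>/libapp.so)."""
    with zipfile.ZipFile(apk_path) as apk:
        return [n for n in apk.namelist()
                if n.endswith((".xml", ".dex", ".so"))]

# Demo on a fake in-memory APK; a real one would be
# build/app/outputs/flutter-apk/app-release.apk
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("AndroidManifest.xml", "<manifest/>")
    z.writestr("classes.dex", "dex")
    z.writestr("lib/arm64-v8a/libapp.so", "ELF")
    z.writestr("assets/icon.png", "png")
print(interesting_entries(buf))
```

The same `zipfile` trick powers the `unzip` step used later for string extraction.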
Step 3: What I Investigated
Permissions Audit
I inspected AndroidManifest.xml and audited every permission.
To my surprise, I found permissions I never added myself.
For example, if a simple dictionary app requests
ACCESS_FINE_LOCATION, it’s likely introduced by a third-party library.
This is a silent privacy risk.
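The audit itself is easy to script. A small sketch that pulls every `<uses-permission>` out of a decoded `AndroidManifest.xml` (the sample manifest below is illustrative, not my app's actual one; JADX shows the decoded XML, since release APKs store it in binary form):

```python
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def declared_permissions(manifest_xml: str):
    """Return every <uses-permission> declared in a decoded manifest."""
    root = ET.fromstring(manifest_xml)
    return [p.get(ANDROID_NS + "name")
            for p in root.iter("uses-permission")]

# Illustrative manifest resembling what JADX decodes:
sample = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.WAKE_LOCK"/>
</manifest>"""
print(declared_permissions(sample))
# Anything in this list that you didn't add yourself
# was most likely pulled in by a third-party plugin.
```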
Plugin Code Inspection
While Dart code isn’t visible here, native plugins are. Sometimes, plugin authors leave comments, URLs, or internal logic exposed.
Shocking Findings
- Fake Release Build: My APK was signed with a debug key (CN=Android Debug), making it insecure and unpublishable.
- Hidden Permissions: I discovered android.permission.WAKE_LOCK, a battery-draining permission I never knowingly added.
Stage 2: Flutter Binary Analysis
Instead of advanced tools like reFlutter, I started with a simple but powerful technique: string extraction.
mkdir wordly_hack
unzip build/app/outputs/flutter-apk/app-release.apk -d wordly_hack
strings wordly_hack/lib/arm64-v8a/libapp.so | grep "https"
What I Looked For
- API keys
- Backend URLs
The result? My backend endpoints were visible in plain text. I reverse-engineered static app data in seconds.
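The `strings | grep` pipeline is simple enough to reimplement yourself. A rough Python equivalent, run here on a simulated slice of `libapp.so` (the embedded URL is a made-up example, not my real endpoint):

```python
import re

def printable_strings(data: bytes, min_len: int = 4):
    """Rough equivalent of the Unix `strings` tool: find runs of
    printable ASCII at least min_len bytes long."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, data)]

# Simulated binary slice: compiled code with an embedded endpoint.
blob = b"\x00\x7fELF\x02\x01" + b"https://api.example.com/words" + b"\x00\x04\x9c"
urls = [s for s in printable_strings(blob) if "https" in s]
print(urls)
```

Any literal URL compiled into the binary falls out of this scan in one pass, which is exactly what happened to my app.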
The Failed Defense: Obfuscation
I rebuilt the app using Flutter obfuscation, assuming it would protect everything.
flutter clean
flutter build apk --release --obfuscate --split-debug-info=./debug_info
I repeated the same string extraction attack—and the API URL was still there.
Why?
- Obfuscation protects logic, not data
- Strings are treated as display content, not secrets
The Hard Truth
You cannot fully hide secrets on a mobile device.
If an app needs a key to function, that key exists on the device—and can be extracted. The goal is not perfection, but resistance.
Better Security Strategies
1. Backend Proxy (Recommended)
Route all API calls through your own backend. Attackers only see your server, never the real API key.
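A sketch of the core idea, with a hypothetical upstream API and key (`UPSTREAM`, `API_KEY`, and the `/define` route are all made up for illustration). The point is that the secret is attached server-side, so it never ships inside the APK:

```python
import urllib.parse

# Hypothetical upstream API; the key lives only on the server.
UPSTREAM = "https://real-dictionary-api.example.com/define"
API_KEY = "server-side-secret"  # never embedded in the APK

def build_upstream_request(word: str) -> str:
    """The app calls your own /api/define?word=... endpoint; the proxy
    attaches the real key and forwards. Reverse-engineering the APK
    reveals only your server, never UPSTREAM or API_KEY."""
    query = urllib.parse.urlencode({"word": word, "key": API_KEY})
    return f"{UPSTREAM}?{query}"

print(build_upstream_request("shock"))
```

As a bonus, the proxy is also where you can add rate limiting and per-user authentication.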
2. Runtime String Encryption
Obfuscate sensitive strings (for example, XOR them with a key; note that Base64 alone is encoding, not encryption) and decode them only at runtime. This defeats basic automated string-scanning attacks.
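A minimal XOR sketch of the idea (the key and endpoint here are illustrative; in practice you would derive or split the key rather than hard-code it, since XOR only raises the bar, it does not make extraction impossible):

```python
def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Symmetric XOR: the same call both encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

KEY = b"wordly"  # illustrative key only

# Ship only the XOR-ed bytes in the binary...
encrypted = xor_bytes(b"https://api.example.com", KEY)
# ...and decode at runtime, just before use:
endpoint = xor_bytes(encrypted, KEY).decode()
print(endpoint)

# A plain-text `strings | grep https` scan now finds nothing:
assert b"https" not in encrypted
```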
The Golden Rule
Treat every API endpoint as a secret. Leaving URLs in plain text is like leaving your house key under the doormat. Keeping them hidden:
- Stops automated scanning attacks
- Protects infrastructure from abuse
- Reduces financial and performance risks
Conclusion
This experiment fundamentally changed how I view development. “If it works” does not mean “it’s safe.”
- Release builds are not automatically secure
- Obfuscation protects logic, not secrets
- Plain-text URLs are a gift to attackers
- You don’t need to be a hacker to reverse-engineer apps
The tools are free. The process takes minutes. If I could hack my own app, anyone can.
As developers, we must start thinking like attackers—before they do.