![How To Implement Screen Sharing To Your Android App [2023]. With Code Examples — cover illustration](https://cdn.prod.website-files.com/64e8910adc5a63966a68acea/69e38bbfbcb595eaa2d77d68_64e8910adc5a63966a68ae22_%25D1%2581%25D0%25BA%25D1%2580%25D0%25B8%25D0%25BD%25D1%2588%25D0%25B0%25D1%2580%25D0%25B8%25D0%25BD%25D0%25B3%2520%25D0%25BD%25D0%25B0%2520android.png)
Key takeaways
• MediaProjection + a foreground service + WebRTC ScreenCapturerAndroid. Those are the three ingredients. Skip one and Android 14+ either crashes your app or silently refuses to start the broadcast.
• Android 14 enforces foregroundServiceType="mediaProjection" + the matching manifest permission. Without FOREGROUND_SERVICE_MEDIA_PROJECTION in the manifest, startForeground() throws a SecurityException the moment it runs.
• Android 15+ tightens it further. Apps targeting API 35 must still request a fresh consent intent via MediaProjectionManager.createScreenCaptureIntent() for every session (the token has been single-use since Android 14), and the redesigned system picker asks the user to re-confirm each time.
• Renegotiate the peer connection on track swap. Replacing the camera track with a screen track fires onRenegotiationNeeded; if your signalling layer ignores it, the remote peer never sees the screen.
• Fora Soft has shipped this in production. We use the MediaProjection + SFU pattern on ProVideoMeeting, Nucleus, and TransLinguist. See § Mini case.
Why Fora Soft wrote this playbook
We have been building Android WebRTC clients since 2013 — video-conferencing products, telemedicine clients, on-premise secure-comms apps, interactive classrooms. Screen sharing looks simple in the documentation and breaks in production as soon as Android ships a new minor version. Android 10 required a foreground service. Android 13 added POST_NOTIFICATIONS. Android 14 tied each foreground service type to a matching permission. Android 15 made MediaProjection tokens single-use.
The previous version of this article covered the basic MediaProjection + WebRTC flow. It is still correct but incomplete: it did not show the Android 14/15 crashes every target-SDK bump triggers, the SFU-direct pattern for large rooms, or the on-device audio capture trick that Zoom, Meet, and Discord now all rely on. This rewrite brings the guide up to April 2026, with the pitfalls and the Kotlin we actually ship at Fora Soft.
Android 14 bumped your targetSdk and screen sharing stopped working?
We have migrated MediaProjection pipelines across a dozen video-calling Android apps. Share your Crashlytics and manifest — we will diagnose the missing permission or policy in 30 minutes.
What changed in Android 14 and 15 — the 90-second briefing
Three platform shifts broke code that used to work. Skim these before copying any snippet.
1. Android 14 (API 34) — type + permission enforcement. Your foreground service must declare android:foregroundServiceType="mediaProjection" and your manifest must include <uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION"/>. Missing either throws SecurityException the instant startForeground() runs.
2. Android 14 — MediaProjection token is single-use. You cannot cache the consent intent and reuse it later. Every fresh broadcast needs a new createScreenCaptureIntent() + user prompt.
3. Android 15 (API 35) — partial-screen capture and prompt re-styling. Users can now share a single app window instead of the whole screen; the system picker is redesigned and asks the user to re-consent on every session. Handle the new Surface lifecycle callbacks to rotate cleanly when the chosen window changes orientation.
If your target SDK is still 33 you are living on borrowed time — Google Play requires API 34 for new releases, and API 35 becomes mandatory later in 2026.
Architecture — the five moving parts
Every production Android screen-share implementation we ship has the same shape. Understanding which component owns which responsibility saves you from rewriting half the app when Android changes another rule.
- MediaProjectionManager & consent intent. Asks the user for permission to capture the screen; returns an Intent you hand to WebRTC.
- A dedicated foreground service (ScreencastService). Type mediaProjection, with a persistent CallStyle or progress notification.
- WebRTC ScreenCapturerAndroid. Wraps the consent intent and delivers frames to the VideoSource.
- Signalling & renegotiation. Triggers on track swap so the remote peer knows about the new stream.
- Audio path. Optional AudioPlaybackCapture for on-device audio (Android 10+) or the microphone for narration.
Step 1 — request MediaProjection consent
Use the ActivityResult API. startActivityForResult is deprecated and breaks with Compose-first navigation.
```kotlin
private val screenPermissionLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    if (result.resultCode == Activity.RESULT_OK && result.data != null) {
        startScreencastService(result.data!!)
    } else {
        // user declined; fall back to camera-only UX
    }
}

fun requestScreenCapture() {
    val manager = getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
    screenPermissionLauncher.launch(manager.createScreenCaptureIntent())
}
```
Crucial subtlety for Android 14+: do not attempt to reuse the Intent for a second session. The system invalidates the token after the first use; every broadcast needs a fresh prompt.
Step 2 — manifest and ScreencastService
```xml
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
<uses-permission android:name="android.permission.POST_NOTIFICATIONS" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />

<service
    android:name=".ScreencastService"
    android:foregroundServiceType="mediaProjection"
    android:exported="false"
    android:stopWithTask="true" />
```
```kotlin
class ScreencastService : Service() {

    override fun onBind(intent: Intent?): IBinder? = null

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        val notification = ScreencastNotificationBuilder(this).build()
        ServiceCompat.startForeground(
            this,
            NOTIFICATION_ID,
            notification,
            ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION,
        )
        return START_STICKY
    }
}
```
Two details save days later. First, use ServiceCompat.startForeground with the type argument so the compat library passes the right flag on every API level. Second, keep the notification build synchronous — never do I/O before startForeground or you will trip the 5-second ForegroundServiceDidNotStartInTimeException.
Step 3 — wire ScreenCapturerAndroid into WebRTC
```kotlin
fun startScreenTrack(permissionData: Intent): VideoTrack {
    val surfaceTextureHelper = SurfaceTextureHelper.create(
        "ScreencastThread", eglBase.eglBaseContext,
    )
    val mediaProjectionCallback = object : MediaProjection.Callback() {
        override fun onStop() { stopScreencastService() }
    }
    val capturer = ScreenCapturerAndroid(permissionData, mediaProjectionCallback)
    val source = peerConnectionFactory.createVideoSource(/* isScreencast = */ true)
    capturer.initialize(surfaceTextureHelper, context, source.capturerObserver)
    val metrics = context.resources.displayMetrics
    capturer.startCapture(metrics.widthPixels, metrics.heightPixels, 24) // 24 fps is plenty
    return peerConnectionFactory.createVideoTrack(SCREEN_TRACK_ID, source)
}
```
isScreencast = true tells the encoder to use content-type=screen settings: lower I-frame frequency, higher quality for motion-free regions, and an RTP simulcast layer tuned for text. Leaving it false gives you a smeary broadcast that looks worse than the camera stream it replaced.
Step 4 — renegotiate the peer connection
When you swap the camera track for the screen track, WebRTC flags the peer connection as dirty and fires onRenegotiationNeeded. Your signalling layer has to rerun the offer-answer exchange, or the remote peer keeps seeing the frozen camera frame.
```kotlin
val peerObserver = object : PeerConnection.Observer {
    override fun onRenegotiationNeeded() {
        scope.launch {
            val offer = peerConnection.createOfferSuspend()
            peerConnection.setLocalDescriptionSuspend(offer)
            signallingClient.send(SignalingMessage.Offer(offer.description))
        }
    }
    // ... other callbacks
}
```
If you use RtpSender.setTrack() instead of localMediaStream.removeTrack / addTrack, WebRTC can often skip full renegotiation — the SDP doesn’t change, only the track binding does. That is the pattern used by LiveKit and the 2026 version of the Google sample app.
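A minimal sketch of that setTrack pattern, assuming videoSender is the RtpSender that originally published the camera track and screenTrack comes from startScreenTrack() above:

```kotlin
// Swap camera → screen on the existing sender; the SDP m-line stays the
// same, so no offer-answer round-trip is needed in the common case.
// `videoSender` and `screenTrack` are assumptions from the surrounding text.
val swapped = videoSender.setTrack(screenTrack, /* takeOwnership = */ false)
if (!swapped) {
    // setTrack fails if the sender was stopped; fall back to
    // removeTrack/addTrack plus a full renegotiation.
}
```

Keep a reference to the camera track so the reverse swap at the end of the broadcast is the same one-liner.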
Need Android + iOS screen share that speaks to the same SFU?
We ship cross-platform screen sharing end-to-end on LiveKit, Janus, and custom SFUs. Describe your call topology and we will scope a delivery plan fast.
Stopping the broadcast cleanly
Three stop paths exist and your code has to handle all three or the notification lingers:
1. In-app stop button. Call capturer.stopCapture(), remove the screen track, swap the camera track back in, then stopSelf() on the ScreencastService.
2. User ends the share from the system status bar. Android fires MediaProjection.Callback.onStop(). Mirror the teardown from path 1 and notify the remote peer via your signalling channel.
3. Call ends entirely. Ensure the service shuts down from the same call-teardown path that releases your peer connection; otherwise the notification outlives the call.
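All three paths can funnel into one teardown routine. A sketch, reusing names from the earlier snippets plus a hypothetical signallingClient message type:

```kotlin
// Shared teardown, called from the in-app stop button, from
// MediaProjection.Callback.onStop(), and from call end.
// `screenCapturer`, `screenSender`, and `SignalingMessage.ScreenShareEnded`
// are assumptions standing in for your session object's fields.
fun stopScreenShare() {
    runCatching { screenCapturer.stopCapture() }              // safe if already stopped
    peerConnection.removeTrack(screenSender)                  // drop the screen track
    signallingClient.send(SignalingMessage.ScreenShareEnded)  // tell the remote peer
    context.stopService(Intent(context, ScreencastService::class.java))
}
```

Making the routine idempotent matters: on Android 14+ the status-bar stop and your in-app stop can race, and the second caller must be a no-op.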
On-device audio capture — the underused piece
Android 10 introduced AudioPlaybackCaptureConfiguration, which lets the screen-share session capture the same audio the device is playing (a YouTube clip, a music app, a podcast). Combined with ScreenCapturerAndroid it gives you full parity with Zoom’s “share sound” toggle.
```kotlin
val config = AudioPlaybackCaptureConfiguration.Builder(projection)
    .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
    .addMatchingUsage(AudioAttributes.USAGE_GAME)
    .build()

val audioRecord = AudioRecord.Builder()
    .setAudioPlaybackCaptureConfig(config)
    .setAudioFormat(
        AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setSampleRate(48_000)
            .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
            .build()
    )
    .setBufferSizeInBytes(bufferSize)
    .build()
```
Hook the PCM frames into a custom AudioSource and publish them as a second audio track alongside the screen video. Keep the microphone path alive so the presenter can narrate over the captured audio — mix at the remote peer or on the SFU.
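One way to drain the capture, sketched under the assumption that onPcmFrame is your bridge into a custom audio pipeline (libwebrtc on Android exposes no public push-PCM AudioSource, so that bridge — typically a custom AudioDeviceModule — is yours to build):

```kotlin
// Read loop for the AudioRecord built above; run it on a dedicated thread.
// `isSharing` (a volatile flag) and `onPcmFrame` are assumptions, not
// WebRTC APIs.
audioRecord.startRecording()
val pcm = ByteArray(bufferSize)
while (isSharing) {
    val read = audioRecord.read(pcm, 0, pcm.size)
    if (read > 0) onPcmFrame(pcm, read)  // 48 kHz, stereo, 16-bit PCM
}
audioRecord.stop()
audioRecord.release()
```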
Performance and encoder choices
Three levers control CPU, battery, and picture quality:
1. Resolution. Capture at device resolution; let WebRTC scale down via simulcast. On a Galaxy S23 that means 1080p × 24 fps at source, with 540p and 360p simulcast layers for the SFU.
2. Codec. H.264 hardware encoder for broad compatibility. VP9 or AV1 on devices that support them for better screen-text quality — WebRTC negotiates automatically when both peers agree.
3. Frame rate. 15 fps for slides, 24 fps for general content, 30 fps for games/video. Higher rates drain battery fast and rarely help.
Always call peerConnectionFactory.createVideoSource(isScreencast = true) — the screencast flag flips the encoder into content-type=screen mode, which keeps text sharp and cuts bitrate by ~25% on static slides.
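Frame-rate and bitrate caps can also be applied per encoding through RtpParameters. A sketch for slide-heavy content, assuming screenSender is the RtpSender carrying the screen track:

```kotlin
// Cap the screen sender without touching the SDP. `screenSender` is the
// assumed RtpSender publishing the screen track; the numeric values are
// illustrative starting points, not tuned constants.
val params = screenSender.parameters
params.encodings.forEach { encoding ->
    encoding.maxFramerate = 15           // slides rarely need more
    encoding.maxBitrateBps = 1_500_000   // ~1.5 Mbps ceiling for 1080p text
}
screenSender.parameters = params
```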
Publishing the screen as a second track to your SFU
Small 1-1 calls can survive a track swap. Rooms with 10+ participants should publish the screen as a separate SFU track instead of replacing the camera. Reasons:
- Participants can decide whether to display camera or screen — or both in a tiled grid.
- Renegotiation cost scales with room size when you swap tracks; a dedicated screen track avoids SDP churn per participant.
- Mobile data savings: SFUs with simulcast can downgrade the screen layer independently when the viewer is on cellular.
LiveKit, Agora, Janus, Daily, 100ms, and mediasoup all accept a second RtpSender from the same peer connection with different MediaStreamTracks. Label the screen track with a clear trackId (e.g. screen_presenter_123) so the UI can distinguish it from camera tiles.
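Publishing the second track is one addTrack call on the existing peer connection. A sketch with illustrative stream and track ids:

```kotlin
// Add the screen next to the camera on the same PeerConnection. The SFU
// sees a new m-line once, instead of SDP churn on every swap. The ids
// ("screen_presenter_123", "screen_stream") are illustrative; `screenSource`
// is the screencast VideoSource from Step 3.
val screenTrack = peerConnectionFactory
    .createVideoTrack("screen_presenter_123", screenSource)
val screenSender = peerConnection.addTrack(screenTrack, listOf("screen_stream"))
```

Hold on to screenSender: it is the handle you need for the RtpParameters caps above and for removeTrack at teardown.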
Mini case — screen share for an on-premise comms platform
On Nucleus, an on-premise comms product used in regulated environments, we had to ship Android screen sharing for Samsung Galaxy A-series devices running Android 13 and 14 in mixed fleets. The original implementation replaced the camera track, triggered full SDP renegotiation, and crashed on any device where the build targeted API 34 after a firmware bump.
Our three-week fix: added FOREGROUND_SERVICE_MEDIA_PROJECTION and POST_NOTIFICATIONS, moved to ServiceCompat.startForeground with the type argument, switched to publishing the screen as a second track on the SFU, and added the AudioPlaybackCapture path so users can share a training video with sound. Crash rate on API 34 devices went to zero in the first 10 days; median time-to-first-frame after the picker tap dropped from 3.1 s to 1.4 s. Total engineering effort: roughly 140 hours including QA across seven device models, accelerated with our Agent Engineering workflow. Want the same triage for your app? Book a 30-min review.
Permissions and policy matrix (Android 13/14/15)
| Item | Android 13 | Android 14 | Android 15 |
|---|---|---|---|
| FOREGROUND_SERVICE | Required | Required | Required |
| FOREGROUND_SERVICE_MEDIA_PROJECTION | — | Required | Required |
| foregroundServiceType="mediaProjection" | Required | Required | Required |
| POST_NOTIFICATIONS | Runtime prompt | Runtime prompt | Runtime prompt |
| MediaProjection token reuse | Allowed | One-shot (re-consent required) | One-shot + partial-screen option |
| Partial-app-window sharing | — | — | User-selectable |
A decision framework — pick the right pattern in five questions
1. 1-1 call or group? 1-1 → track swap is fine. Group → publish screen as a second SFU track.
2. Do presenters share audio? Yes → AudioPlaybackCapture + a second audio track. No → skip.
3. What’s your content? Slides → 15 fps, 1080p. Demo video or game → 30 fps, let simulcast handle downscaling. Text-heavy IDE → VP9/AV1 if both peers support it.
4. Which Android minimum? API 26 is the realistic floor. API 29+ for AudioPlaybackCapture. API 34 mandatory for Play new releases.
5. Do you have a WebRTC SFU already? Yes → done. No → consider LiveKit / mediasoup before building screen share on peer-to-peer — it won’t scale past 4 participants.
Five pitfalls we keep finding in audits
1. SecurityException: Media projections require a foreground service of type FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION. You forgot to add the permission or the foregroundServiceType attribute. Do both.
2. Starting the foreground service from the background. Android 12+ throws ForegroundServiceStartNotAllowedException. Trigger the service from the user-gesture activity that also launches the consent picker, or from a high-priority FCM callback.
3. Passing isScreencast = false to createVideoSource. You’ll waste 25–40% bandwidth on static slides and text will look blurry. Flip it on.
4. Caching the MediaProjection intent across sessions. Worked on Android 13, throws on 14. Request fresh permission every broadcast.
5. Ignoring MediaProjection.Callback.onStop. The user can end the share via the status bar; if your service doesn’t tear down, the notification lingers and Android may flag your app. Always wire onStop to stopSelf().
KPIs — what to measure after shipping
Quality KPIs. Time-to-first-frame after consent (target < 1.5 s), crash rate of ForegroundService*Exception per 1,000 sessions (target 0), and frozen-screen rate (target < 0.5%).
Business KPIs. Screen-share adoption per call (baseline vs post-launch), paid-plan conversion lift when the feature is gated, and user-retention delta in sessions that included a screen share.
Reliability KPIs. Successful broadcast teardown rate (target > 99%), median battery drain during a 30-minute broadcast (benchmark < 8% on a Pixel 8), and on-device audio-capture success rate (target > 97% on devices targeting Android 11+).
When not to ship Android screen sharing
1. Peer-to-peer only, 3+ participants. Screen share multiplies media paths; without an SFU, uplink bandwidth on a mid-tier phone collapses after the third viewer. Move to a media server first.
2. Strict compliance environments without MediaProjection approval. Some MDM profiles disable MediaProjection entirely; know this before you promise the feature. A server-rendered document sharing flow may be the right alternative.
3. Android TV or kiosk devices. MediaProjection is available but often not useful — the screen is the product. Ship in-app slides instead.
Planning an Android 14/15 screen-share migration?
We have done it for telemedicine, video-calling, and regulated comms apps. Send your manifest and we will scope a fixed delivery.
FAQ
Why does my app crash with SecurityException on Android 14 the moment I start the screen share?
Android 14 requires two things: the FOREGROUND_SERVICE_MEDIA_PROJECTION permission in the manifest and android:foregroundServiceType="mediaProjection" on the <service>. The manifest declaration and the startForeground() type argument must match exactly.
Can I keep the MediaProjection consent and use it for a second broadcast later?
Not on Android 14+. The system invalidates the token after the first broadcast ends. Every new screen-share session must call MediaProjectionManager.createScreenCaptureIntent() and re-prompt the user.
Should I replace the camera track with the screen track, or add it as a separate track?
Replace for 1-1 calls where bandwidth is tight. For group calls on an SFU, publish the screen as a second track. Participants can then display camera and screen simultaneously, and simulcast can downscale the screen layer for viewers on cellular.
Why does my shared screen look blurry, especially for text?
You almost certainly called createVideoSource(false). Passing true flips WebRTC into content-type=screen mode: fewer keyframes, tuned bitrate, sharper text. You can also bump the simulcast high layer to 1080p and drop the framerate to 15 fps for slide-heavy content.
How do I capture audio from the device (not the mic) during screen sharing?
Use AudioPlaybackCaptureConfiguration (Android 10+) on the same MediaProjection token, build an AudioRecord with the config, and push the PCM frames into a custom WebRTC audio track. This is how “share audio” works in Zoom, Meet, and Discord on Android.
The remote peer never sees the screen after I swap the track — what’s wrong?
You are missing the renegotiation step. Swapping tracks fires onRenegotiationNeeded; your signalling code must recreate the SDP offer and send it to the remote. Alternatively, use RtpSender.setTrack() which often avoids renegotiation entirely.
Does Android 15’s partial-screen sharing break my existing code?
Not if your code responds to VirtualDisplay dimension changes. When the user picks a single app window, the Surface you render to may resize. Let ScreenCapturerAndroid adapt and emit onCapturerStarted — it handles the new sizes automatically if you are on WebRTC M127+.
How long does it take to ship production-quality Android screen share?
On an existing WebRTC Android client, budget 7–10 engineering days for MediaProjection integration, foreground service wiring, and QA across three or four device classes. Add another 3–5 days for AudioPlaybackCapture and SFU multi-track publishing. Fora Soft typically lands the full scope in roughly 10 calendar days with Agent Engineering-accelerated delivery.
What to read next
ANDROID PLATFORM
Foreground Service and Deep Links on Android
The Android 14 type + permission rules that your screen-share service has to follow.
NOTIFICATIONS
Custom Android Call Notifications
CallStyle notifications that pair with your foreground screen-share service.
iOS
Implement Screen Sharing in an iOS App
The ReplayKit Broadcast Upload Extension counterpart to this Android guide.
MEDIA ARCHITECTURE
P2P vs MCU vs SFU — Which Media Server Fits?
The media topology that dictates how you publish the screen track.
Ready to ship Android screen sharing that survives the next platform bump?
Android screen sharing in 2026 is the sum of four decisions: request consent the modern way, run a foreground service of type mediaProjection with the matching runtime permission, wire ScreenCapturerAndroid into a WebRTC video source (with isScreencast = true), and let your SFU own the track so group calls scale. Add AudioPlaybackCapture if the presenter shares sound.
If you want a team that has shipped this pattern in production across WebRTC, telehealth, and regulated-comms products, Fora Soft has the Kotlin templates, the Android 14/15 compliance checklist, and the QA matrix ready.
Book a 30-minute review of your Android screen-share design?
We will critique your manifest, service, WebRTC pipeline, and SFU track strategy, then hand back a concrete punch list. Agent Engineering-accelerated.