Media Server
FAQ

Is there a user forum for Unreal Media Server?
Yes. Post questions to our current forum.
An older, read-only forum is also available for reference.

Where can I find video tutorials about Unreal Media Server?
UnrealStreaming channel on YouTube
Our "SDK and Tutorials" web page

Are there any Streaming Providers/CDNs offering hosting of Unreal Media Server?
We have partnered with StreamGuys who offer hosting packages based on their CDN.
You can also run Unreal Media Server on Amazon EC2 Windows Server instances.

How to play live streams on tablets / mobile phones?
There are a number of ways to play on iOS, Android and other mobile devices:
1. HLS streaming: navigate to the .m3u8 file in your browser.
2. Flash-enabled browsers such as the Photon Flash browser.
3. Unreal HTML5 player in the Chrome mobile browser.
4. Play rtmp:// and ums:// links with a media player app capable of playing RTMP or UMS streams.
The recommended mobile player app is mPlayer.

What is the exact format of rtmp:// and ums:// links for Unreal Media Server?
Refer to the Installation and Configuration page.

How to play live streams in web browsers with Unreal HTML5 player?
Make sure your live stream is H.264/AAC encoded. The distance between key-frames must be 500-2000 ms. On the webpage hosting the Unreal HTML5 player, specify the alias of a live broadcast configured on the Media Server. You can use HTTP secure streaming if Unreal Media Server has an SSL certificate configured. Chrome, IE, Edge, Opera, Safari and Firefox browsers are supported on Windows, Mac and Android devices; iOS devices and IE on Windows 7 are not supported.
Sample pages can be found in our SDK package and Demo pages.

Unreal HTML5 player embedded in my webpage shows elements in the wrong places.
If your webpage defines styles with CSS style sheets, the Unreal HTML5 player element will inherit these styles and may not display certain items correctly. In this case you need to embed the Unreal HTML5 player in an iframe, so your webpage will have an iframe that refers to another webpage running the player, for example: <iframe src="http://mywebsite/player1.html" frameborder="0" scrolling="no" width="900" height="700"></iframe>

How to play with Flash Player?
Make sure the content encoding is H.264/AAC/MP3. Download our Flash Player and host it on your website, or use any other Flash player such as JW Player, Flowplayer, etc. Provide an HTML page hosting the player and, in this page, supply an RTMP link that points to the live broadcast or file.
For example, if your Media Server is at 192.168.0.100 and you have a live broadcast with alias "webcam", the link looks like rtmp://192.168.0.100:5119/live/webcam. For the file "test.mp4", residing in the "mediaroot" virtual folder, the link looks like rtmp://192.168.0.100:5119/vod/mediaroot/test.mp4.
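For instance, here is a minimal sketch of such a page using the third-party JW Player library (the jwplayer.js path and the player's RTMP/Flash support are assumptions; see our Demo pages for the exact embedding of our own Flash Player):
<div id="player"></div>
<script src="http://mywebsite/jwplayer/jwplayer.js"></script>
<script>
  // Point the player at the RTMP link of the live broadcast (placeholder address).
  jwplayer("player").setup({
    file: "rtmp://192.168.0.100:5119/live/webcam",
    width: 640,
    height: 360
  });
</script>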
Sample pages can be found in our SDK package and Demo pages.

How to play with Windows Media Player?
Use the menu item File->Open URL. Type an mms:// link referring to a live broadcast or file configured on the Media Server. Note that the content encoding must be WMV/MPEG-4/WMA/MP3.
For example, right after the installation of Media Server you can type
mms://localhost:5119/mediaroot/test.avi
If you have a live broadcast with alias "webcam", you need to type mms://localhost:5119/webcam
When playing remotely, specify the IP address of the server machine: mms://192.168.0.100:5119/webcam

How to play on Mac?
Play RTMP links with Flash Player or the VLC player, or play with the Unreal HTML5 player. Alternatively, download and install the free Windows Media Components for QuickTime, recommended by Microsoft, and then use your QuickTime player exactly the same way as described in the previous answer. You can also play via HLS by navigating to the .m3u8 file with the Safari browser.

How to play on iOS and Android devices with mPlayer mobile app?
The mPlayer mobile app can play live and on-demand streams from Unreal Media Server using the RTMP and UMS-TCP unicast protocols. The content encoding must be H.264/AVC1 for video and AAC/MP3 for audio. Specify rtmp:// or ums:// links to play. Note that ums:// links for this player must use forward slashes only, not backslashes. A working example from our demo server: ums://tcp:65.23.154.147:80/um400

Streaming or transcoding is not working when server runs on Windows Server 2008/2012.
Windows Server 2008/2012 may not have the DirectShow runtime enabled. You need to enable the "Desktop Experience" feature on your server. For transcoding with Unreal Live Server you may also need to install the Desktop Experience decoder update (KB2483177).

What are the UMS streaming protocol advantages and which players can play it?

  • Support for low latency, near real-time live streams
  • Built-in user authentication (use internal authentication for that)
  • Ability to encrypt the streaming channel (use UMS-HTTPS delivery for that)
  • Support for multicast streaming
  • Codec independence: content can be encoded with any codec

Players capable of playing UMS streams:

  • Windows OS: play with our Streaming Media Player or its browser plugin.
  • Android and iOS: play with the mPlayer mobile app (this player supports UMS-TCP unicast mode only, and only H.264/AVC1 video with AAC/MP3 audio).

    I want my users to play with your player. Is it a must for them to download and install Unreal Streaming Media Player?
    No, it's not a must, but our ActiveX control or browser plugin still has to be installed on the user's machine. As a media publisher, you can avoid having users install the Streaming Media Player by providing a web page that hosts our ActiveX control or browser plugin. The page should reference the CAB installation file, which will be automatically downloaded by the browser. When the page is loaded, it will prompt the user to install the control. Refer to the Demo pages hosting the ActiveX control.
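    As an illustration of this approach, here is a generic sketch of such a page; the clsid, CAB file name and version below are placeholders only, so take the real values from the Demo pages or the SDK samples:
    <object id="UnrealPlayer" width="640" height="480"
            classid="clsid:00000000-0000-0000-0000-000000000000"
            codebase="http://mywebsite/UnrealPlayerControl.cab#version=1,0,0,0">
    </object>
    When Internet Explorer loads this page and the control is not yet installed, it downloads the CAB file referenced in the codebase attribute and prompts the user to install it.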

    I am not able to view video with Firefox/Opera etc. browsers.
    Please install a plugin for those browsers.

    "Audio/Video decoder not found" error when playing with Unreal Streaming Media Player.
    The Unreal Streaming Media Player/browser plugin relies on decoders (codecs) installed on your system; Windows 7/8/10 comes with built-in decoders for most compression types. For additional decoders, install the free ffdshow decoder package.

    How to play live H264/AAC streams in Unreal Streaming Media Player?
    You need H.264 and AAC decoders. Windows 7/8/10 comes with these decoders; users of earlier Windows versions should install the free ffdshow decoder package or the free DivX decoder package.

    What kind of DRM does Unreal Media Server support for different streaming protocols?

  • WebSocket protocol streaming to Unreal HTML5 player: Full DRM can be achieved when using secure WebSockets combined with session-based user authentication.
  • UMS protocol streaming: Full DRM can be achieved when using HTTPS transport combined with internal or session-based user authentication.
  • RTMP (Flash) protocol streaming: partial DRM (authorized access) can be achieved by using session-based user authentication.
  • HLS streaming: Full DRM can be achieved by using AES-128 encryption and serving keys (by your own web app) via HTTPS.

    How is Unreal HTML5 live streaming different from MPEG-DASH?
    Unlike MPEG-DASH, Unreal Media Server uses the WebSocket protocol for live streaming to the HTML5 <video> element (via Media Source Extensions) in web browsers. This is much more efficient than fetching segments via HTTP requests, as MPEG-DASH does. Also, Unreal Media Server sends segments of minimal duration, as low as 30 ms. That allows for low, sub-second latency streaming, while MPEG-DASH, like other HTTP chunk-based live streaming protocols, cannot provide low-latency live streaming.

    Can I stream files in on-demand mode to Unreal HTML5 player?
    No, the Unreal HTML5 player is for live broadcasts only. To stream files on demand to the HTML5 <video> tag, you don't need Unreal Media Server: just put your file under a web server and reference it in the <video> source attribute.
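    For example, a minimal page for such on-demand playback directly from a web server (the file URL is hypothetical):
    <video controls width="640">
      <source src="http://mywebsite/media/test.mp4" type="video/mp4">
    </video>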
    However, you can stream files in live mode to the Unreal HTML5 player by creating a live broadcast of the "live playlist" type and adding files to that playlist. Some files encoded with incorrect H.264 B-frame timestamps may not play well in this mode.

    What are Unreal Live Server advantages for live streaming?

  • Near real-time streaming mode. You can use this mode for live conferencing applications.
  • Ability to stream hardware-compressed content, offloading encoding from the computer's CPU. If you have a capture card or device that does video encoding, the only live encoder that will stream its content without transcoding is Unreal Live Server.
  • Unmanned, automated operation. You configure the system and leave. No waste of bandwidth: streaming is only active while there are viewers/recorders.
    Live encoders such as FMLE/OBS push the stream regardless of whether anyone watches or records it. This is fine for broadcasting events, but unacceptable for a system that must be able to start broadcasting whenever somebody wants to watch or record, such as IPTV, radio, video surveillance and digital signage applications. Unreal Live Server running as a Windows service will start encoding and streaming when the first viewer/recorder requests live video, and will stop when the last viewer/recorder disconnects.
  • Ability to encode with user-supplied codec. Any DirectShow-friendly codec can be configured to encode live audio/video.

    How to broadcast live audio/video from my iOS/Android device?
    You need to install a live encoder app on your device that is capable of publishing live RTMP streams, preferably with H.264/AAC encoding.
    iOS: Broadcast Me; Streamsie.
    Android: Broadcast Me.
    With Media Server configuration program, create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream" and select rtmp:// protocol as described in the following answer.

    What live encoders can push a stream to Unreal Media Server and how should they be configured?

    Supported live encoders capable of pushing streams to Unreal Media Server:

  • Unreal Live Server. It connects to Unreal Media Server over UMS-TCP protocol and streams via UMS-TCP or UMS-UDP. Create a dynamic live broadcast with Unreal Media Server configuration program.

  • RTMP Flash software and hardware encoders, such as Adobe FMLE, Telestream Wirecast, Open Broadcaster Software, vMix, XSplit, ffmpeg, Digital Rapids TouchStream, NewTek TriCaster, Teradek VidiU, the Broadcast Me mobile app and others. These encoders connect and publish RTMP streams to Unreal Media Server. The default port for these connections is 5130; this port can be configured. Create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", select the rtmp:// protocol and select the "Push" option. Specify a password for this broadcast if authenticated publishing is needed. On the encoder panel, specify H.264 video encoding and specify the rtmp publishing address of this broadcast. For example, if your server machine's IP address is 192.168.1.8 and the broadcast alias is "Camera8", then the rtmp publishing address is "rtmp://192.168.1.8:5130/live/Camera8". Refer to the screenshot for FMLE and Wirecast, or to the screenshot for other encoders. The stream name can be anything, for example "livestream".
    Here is an ffmpeg command line: "ffmpeg -re -i file.mp4 -acodec copy -vcodec copy -f flv rtmp://192.168.1.8:5130/live/Camera8/livestream".

  • MPEG-TS software and hardware encoders. Any encoder capable of sending MPEG2-TS streams over UDP can push a stream to Unreal Media Server; VLC and ffmpeg are open-source encoders that can do it. Configure the encoder to stream to the IP address of the computer where Unreal Media Server runs, and specify any unused port on that computer. With the Unreal Media Server configuration program, create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", select the mpts:// protocol and specify the same IP address and port that you specified for the encoder.

    Supported live encoders capable of serving streams to Unreal Media Server in pull mode, where Unreal Media Server connects to the encoder:

  • Unreal Live Server. Create a static live broadcast with Unreal Media Server configuration program.

  • RTSP software and hardware servers, cameras and encoders. With Unreal Media Server configuration program, create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", select rtsp:// protocol and specify the RTSP URL. All major brand IP cameras, Orban Opticodec, VLC, Wowza server and any other RTSP-compliant servers are supported.

  • MS-WMSP (MMS) software and hardware encoders. With Unreal Media Server configuration program, create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", select mms:// protocol and specify IP Address and port of the encoder computer. Windows Media Encoder and VLC are supported. This method is intended to be used with Windows Media encoding and Windows Media / Silverlight / Unreal players.

  • RTMP servers. Create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", select the rtmp:// protocol, select the "Pull" option and specify the rtmp URL to pull the stream from.

    How to publish a multi-bitrate stream to Unreal Media Server?
    Use RTMP publishing encoders such as FMLE and Wirecast. Refer to the instructions in the answer to the previous question, with the following changes:
    For multi-bitrate encoding with FMLE, select several video bitrates and add %i to the stream name, for example "livestream%i". If you are publishing 3 streams to a live broadcast named "Camera8", then you automatically have 3 aliases available for playback: "Camera8", "Camera8_2" and "Camera8_3". For multi-bitrate encoding with Wirecast, create a separate live broadcast (in the Media Server configuration program) for each bitrate; each Wirecast profile should stream to a separate live broadcast alias.

    What can I do with "live playlist" type of live broadcast?

  • Server-side switching between different audio/video sources streaming to the same player; you can do it manually, programmatically (using our SDK), or based on schedule.
  • Make sure that your live stream never breaks, even if the camera is not available. If a camera goes down, a "standby" file of your choice will be streamed until the camera is back online.
  • Server-side ad insertion. Stitch advertisement videos into a live stream.
  • Stream a file as a live broadcast.
  • Stream whole folders as a live broadcast. Create your digital library streaming as a live broadcast, providing a TV-like experience. When combining with Timeshift, you also provide a trick-play functionality: pause, resume and seek in the live stream.

    Note that a live playlist should preferably contain live broadcasts and files encoded with the same codecs and encoding settings. Whether a player can switch on the fly to differently encoded content depends on the player: Unreal Streaming Media Player, MPEG2-TS players (STBs) and Flash player can switch smoothly between different encodings, while the Unreal HTML5 player and HLS players require all items in the playlist to have exactly the same encoding.

    How to play live broadcasts in time-shifted mode?
    Live streaming with timeshift is supported in Unreal Streaming Media Player, Flash player (use our Flash player) and our HTML5 player. With the Media Server configuration program, right-click on a live broadcast and select "Start Buffering for time-shifted playback". It takes 15 seconds for the buffer to become active. All players playing this live broadcast will display a timeline where you can click to seek back in time, jump to live view again, pause and resume live playback. Once you stop the buffering, the temporary buffer remains available for 24 hours, but only to players that are already open. To provide older recorded content for playback, use Unreal Archival Server, which records live streams to mp4 and asf files.

    How to stream via HLS (Apple Live HTTP streaming) to iOS and other HLS-enabled devices?
    First of all, make sure that your live encoder provides encoding compatible with HLS: video encoding should be H.264, audio encoding should be AAC or MP3, and the distance between key-frames must be no longer than 2000 ms. With the Unreal Live Server configuration program, select Manual encoding, choose the H.264 codec and set a 2000 ms distance between key-frames.
    Secondly, in addition to Unreal Media Server, you need a standard web server for HLS streaming; any web server will do. Configure a web folder on your web server and make sure this folder is allowed to serve .m3u8 and .ts files. You may need to create MIME type associations for these extensions: .m3u8 to application/x-mpegURL and .ts to video/MP2T.
    Also, you must set immediate expiration on your web folder. For IIS web server, open IIS Manager, select your web folder -> HTTP Response Headers -> Set Common Headers -> Expire Web Content Immediately.
    In addition, if your .m3u8 file is going to be consumed by web players such as Flowplayer, you need to put a crossdomain.xml file in your web root folder (for IIS, c:\inetpub\wwwroot) and enable cross-origin access (CORS) for your HLS web folder by adding an "Access-Control-Allow-Origin: *" custom HTTP header.
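    A permissive crossdomain.xml looks like this (you can restrict the domain attribute to your own site instead of "*"):
    <?xml version="1.0"?>
    <cross-domain-policy>
      <allow-access-from domain="*" />
    </cross-domain-policy>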
    To begin HLS, open the "Start HLS Streaming" dialog in the Unreal Media Server configurator by right-clicking on a live broadcast. Specify the path to the configured web folder. Also specify the web URL of this web folder, i.e. the URL that outside users will type in their web browsers to navigate to it. For the lowest latency, specify 1-second-long .ts files and 3 .ts files to keep. Click OK. Notice that .ts files start to appear in your web folder, along with an .m3u8 file. iOS users just need to navigate to this .m3u8 file with their browsers. For example, if your web folder is named "HLS", the public IP address of your server computer is 123.123.123.123, and you are streaming a live broadcast named "evildude", then iOS users need to navigate their Safari browsers to http://123.123.123.123/HLS/evildude.m3u8.
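    Browsers and devices with native HLS support (for example, Safari) can also play the stream from a simple page; a minimal sketch using the example address above:
    <video src="http://123.123.123.123/HLS/evildude.m3u8" controls width="640"></video>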
    It is recommended to create a RAM disk and place the web folder for HLS on that disk; that way you spare your hard disks from the heavy .ts file creation/deletion associated with HLS.
    You can provide DRM for HLS streams by enabling AES-128 encryption and providing a web application that serves AES-128 key files to authorized users over HTTPS.

    How to stream a file or a whole folder via HLS?
    Create a live playlist (a live broadcast of type "live playlist") and add a media file or a virtual folder containing media files to that playlist. Then follow the instructions from the previous answer.

    How to create live streaming on a web page that will play on any OS and device?
    If the latency of the live stream is not important for your streaming application (IPTV, event broadcasting), then HLS may be a good solution. Modern players such as JWPlayer and Flowplayer can play HLS in any browser on any device.
    However, if your application needs low-latency streaming (surveillance, conferencing), then HLS is not acceptable and you cannot create one simple webpage that will play on any device. In this case, you have several playback options:
    1. Unreal HTML5 player will play in all major web browsers on all devices, except iOS and IE on Windows 7.
    2. Flash Player will play in all browsers except iOS and some Androids.
    3. Web page hosting Unreal Streaming Media Player (this web page will work on Windows OS only in all browsers except Chrome).
    4. Non web-based, standalone player apps: VLC, Unreal Streaming Media Player on Windows OS; mPlayer mobile app on iOS and Android.

    How to create an adaptive bitrate streaming?
    Unreal Media Server supports adaptive streaming for HLS, Flash and Unreal HTML5 players. You need to publish a multi-bitrate live stream to Unreal Media Server, or have a file encoded in several versions, each corresponding to a separate bitrate. Once a live multi-bitrate stream is being published with FMLE, start HLS using the Media Server configuration program. The generated .m3u8 file contains references to the multiple bitrates, so it is ready for adaptive streaming to an HLS-enabled player/device. If you publish a multi-bitrate stream with Wirecast, you need to create a separate live broadcast (in the Media Server configuration program) for each bitrate. In this case, start HLS for each live broadcast and then create a master .m3u8 file that contains references to the individual bitrate .m3u8 files.
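    For reference, such a master playlist is a small text file along these lines (the bandwidth values and the individual .m3u8 URLs below are placeholders for your own bitrate variants):
    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
    http://123.123.123.123/HLS/Camera8_low.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
    http://123.123.123.123/HLS/Camera8_high.m3u8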
    For adaptive RTMP, provide the live broadcast aliases that receive the multi-bitrate streams (or the alias_ variations described above), or the file names for each bitrate, in the HTML page hosting a capable Flash player such as Flowplayer.
    For adaptive HTML5 playback, provide a list of aliases to the Unreal HTML5 player. Refer to the demo pages for examples.

    How can other streaming systems pull live streams from Unreal Media Server?
    They can pull live streams via RTMP(Flash) and MPEG2-TS protocols. Software such as ffmpeg, XSplit, VMix, Wowza streaming server, and many others are supported.

    Can Unreal Media Server push RTMP streams to CDNs?
    No. You can use ffmpeg to pull an RTMP stream from Unreal Media Server and publish it to CDNs.
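    For example, assuming the RTMP playback link format shown earlier and a hypothetical CDN ingest URL, an ffmpeg command along these lines re-publishes the stream: ffmpeg -i rtmp://192.168.1.8:5119/live/Camera8 -acodec copy -vcodec copy -f flv "rtmp://cdn-ingest.example.com/live/streamkey"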

    How to stream to Silverlight player via MS Smooth streaming protocol?
    For this to work, you need to run the IIS7 web server with the Media Services extension installed. Using the IIS Manager tool, create a Live Smooth Streaming publishing point and give it the same name your live broadcast has. After this is done, open the "Start MS Smooth Streaming" dialog in the Unreal Media Server configurator by right-clicking on a live broadcast. Specify the URL of the publishing point and click OK. For optimal performance, make sure that the live video encoding has a distance between video I-frames in the range of 1-2 seconds. Follow this tutorial on configuring a publishing point and providing a simple Silverlight player. The tutorial uses MS Expression Encoder to stream to the publishing point; Unreal Media Server does the same.

    What IP cameras are supported?
    Any RTSP IP camera can serve as a live source for Unreal Media Server; the Media Server will connect to the camera and pull the RTSP stream from it.
    For older IP cameras that only support JPEG, use the IPCamSourceVideo component, available for download. It receives video from all major brands of IP cameras but does not support audio.

    How can I stream my RTSP IP camera to multiple online HTML5/Flash/HLS viewers?
    If your IP camera provides H264/AAC encoding, then Unreal Media Server can ingest the RTSP stream from it and output the original hardware-encoded content via the WebSocket, RTMP/RTMPT and HLS protocols to multiple concurrent HTML5/Flash/HLS viewers. With the Media Server Configurator, create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", select the rtsp:// protocol and specify the RTSP URL of your IP camera, as well as a username and password if needed. Be sure to specify an RTP transport protocol that the camera supports.
    Then start HLS and/or create a web page with a Flash player and host it on your web server. Sample pages can be found in our SDK package and Demo pages.

    How can I transcode an MPEG-4/G.711 live stream from my RTSP IP camera to an H.264/AAC live stream on the fly?
    You can do it with Unreal Live Server and our RTSP DirectShow source filter, available for download. Note that you only need to do this if your players are Flash, Silverlight or iOS devices, because these players cannot play MPEG-4/G.711 content. Otherwise, you don't need transcoding and you don't need Unreal Live Server: use Unreal Media Server directly, as described in the previous answer.

    Internet-based Unreal Media Server cannot connect to my LAN-based RTSP IP camera; I don't control the NAT router and am unable to forward ports.
    You need to run Unreal Live Server and our RTSP DirectShow source filter, available for download, on the LAN where your IP camera is located. Using the Live Server configuration program, configure a live source to pull an RTSP stream from your IP camera, and select "stream hardware compressed content" so that the Live Server will stream out the original, camera-encoded stream. You can, of course, use software transcoding if you need it, as described in the previous answer.
    Then create a dynamic live broadcast on the Internet-based Unreal Media Server and, using the Live Server configuration program, connect your live source to this newly created dynamic live broadcast.

    How can I transcode Windows Media encoded MMS streams, and MPEG transport streams from encoding hardware, on the fly?
    Unreal Live Server has built-in support for ingesting and transcoding MMS streams; for MPEG transport streams you need to install our MPEG2-TS/HLS DirectShow source filter, available for download.

    How to stream to a Set-Top box via MPEG2-TS protocol?
    Open up a "Start MPEG2-TS Broadcasting" dialog in Unreal Media Server configurator by right-clicking on a live broadcast. Specify IP address of your Set-Top box, and any port. Most STBs, like Amino, need raw UDP MPEG2-TS, not wrapped in RTP. Then connect to the same IP address with your Set-Top controller. Follow STB instructions on how to do that. For example, for Amino STB that has IP address of 192.168.1.5, you need to start streaming to 192.168.1.5:1234. Using USB keyboard attached to Amino STB, open up Opera browser inside STB, and type udp://192.168.1.5:1234. To stream in multicast mode to multiple STBs, specify multicast address, for example 225.1.1.1:1234. Then type igmp://225.1.1.1:1234 in Amino Opera browser's address bar.

    Can I stream via MPEG2-TS over HTTP protocol?
    To stream MPEG2-TS over the Internet to remote STBs, UDP delivery may not be acceptable due to packet loss, so some programs like VLC allow wrapping MPEG2-TS in the HTTP protocol. Unreal Media Server doesn't support that. However, there is an easy and efficient workaround: install an additional instance of Unreal Media Server close to your STBs, preferably in the same LAN where the STBs are located, and create a delegate live broadcast on it to pull the stream from your original Unreal Media Server over the Internet. The new Media Server can then use UDP MPEG2-TS delivery to your STBs.

    Unreal Media Server segments UDP MPEG2-TS stream into 4 IP packets constructed of 7 raw MPEG2 packets. Can it segment the stream into just 1 IP packet instead of 4?
    Create a DWORD registry value named MPEG2TSChunkMTUs under HKLM\SOFTWARE\UNREAL\SERVER (or HKLM\SOFTWARE\WOW6432NODE\UNREAL\SERVER) and set it to 1 (the valid range is 1 to 8).

    I can't stream ASF/WMV files; file streaming is not working via UMS-RTP or MPEG2-TS.
    Your system is missing the Windows Media Format runtime, which is required on the server machine for these streaming modes; you need to install the latest Windows Media Player or the Media Format Runtime. On Windows Server 2008/2012 you need to enable the "Desktop Experience" feature.

    Can I stream a file as a live source in a loop mode?
    The best way to do it is to create a "live playlist" type of live broadcast and add a file to this live playlist.
    Another way of doing this is to stream a file via MPEG2-TS; it will work for files containing H264/MPEG4/AAC/MP3 content. With Media Server Configurator, right-click on virtual folder and start MPEG2-TS broadcasting of that file, specifying 127.0.0.1:6789 as an address (6789 is an arbitrary port and can be any unused port). Then create a live broadcast of type "Rebroadcast live RTMP/RTSP/MPEG2-TS/MMS stream", choose mpts:// as a protocol and specify the same address.

    Unreal Live and Archival servers record mp4 files with the "moov" atom at the end of the file. I need to move the "moov" atom to the beginning, for faster delivery on YouTube.
    Use one of the following utilities to relocate the "moov" atom: MP4 FastStart or MP4Box.
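    If you already use ffmpeg in your workflow, it can also relocate the atom while copying the streams, for example: ffmpeg -i recorded.mp4 -c copy -movflags +faststart faststart.mp4 (the file names are placeholders).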

    Is there any software for parsing/analyzing the Unreal Media Server log format?
    Sawmill analytical platform supports our log format.
    StreamAnalyst is a web-based service that also supports our log format.

    Live Server Config doesn't recognize my card, or I can only choose a 320x240 size.
    The driver of your card must fully support DirectShow and provide a full list of supported video resolutions via the appropriate DirectShow interfaces.

    While configuring the Archival Server, I am asked to enter a password for the Media Server.
    The Archival Server connects to the Media Server, which uses this password for authentication. You have to set up the same password on the Properties page of the Media Server Configurator.

    RTP Multicast packets are sent with TTL (time to live) of 128. How can I change the TTL?
    Create a DWORD registry value named TTL under HKLM\SOFTWARE\UNREAL\SERVER (or HKLM\SOFTWARE\WOW6432NODE\UNREAL\SERVER), and set it to the TTL value you need.

    I can't change Multicast Group IP address configured with Media Server configuration tool.
    You need to change both IP address and port and make sure no other configured resources use the same IP address/Port. When you install multiple Media Servers on your LAN, make sure all resources have different multicast group addresses.

    Unreal Live Server can't stream live video from a computer that does not have an interactive desktop open (no user is logged in).
    This can happen if the video capture card that is used has a Video Port. The purpose of the port is to minimize CPU utilization for video capture rendering. The problem is that the Video Port requires an interactive desktop to work correctly. Gainward GeForce-4 graphics cards are known to require the interactive desktop for video capturing.












       © 2003-2017 Unreal Streaming Technologies. All rights reserved.