live555: compiling and a streaming example
阿新 · Published: 2019-01-01
I was recently assigned to work on ONVIF. ONVIF video streaming uses live555 as the server and plays back via the URL it provides, so live555 is something I had to study as well. This article briefly covers compiling live555, then gives an example built on one of the bundled demo programs.
1. Compilation
Under MinGW on Windows:
$ ./genMakefiles mingw
$ export CC=gcc
$ make
Similarly, on Linux:
$ ./genMakefiles linux
$ make
After compilation, the testProgs directory contains many test programs, which can be run as-is without modifying any code. Take testH264VideoStreamer as an example: copy an H.264 video file into that directory, rename it to test.264, and run testH264VideoStreamer (under MinGW the binary is named testH264VideoStreamer.exe). Then open a network stream in VLC using the address rtsp://ip:8554/testStream, e.g. rtsp://192.168.1.100:8554/testStream.
By default this demo streams the file test.264. To stream a different file, edit the source file testH264VideoStreamer.cpp; to rebuild, simply run make inside testProgs.
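For reference, the file name in the stock demo is a hard-coded constant near the top of testH264VideoStreamer.cpp (the exact line may vary between live555 releases); changing it and rebuilding is all that is needed:

```cpp
// In testH264VideoStreamer.cpp (stock live555 demo):
char const* inputFileName = "test.264";  // change this to stream another file
```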
2. Example
Below, unicast support is added on top of the testH264VideoStreamer program; the unicast part is adapted from the testOnDemandRTSPServer program. The code is as follows:

/*
 * This program provides both unicast and multicast streaming. It is based on
 * testH264VideoStreamer, with the unicast part adapted from testOnDemandRTSPServer.
 *
 * Notes:
 *   Unicast: reconnecting from VLC restarts reading the file from the
 *     beginning; no artifacts.
 *   Multicast: reconnecting from VLC continues from the previous position in
 *     the file. On each new connection there are macroblock artifacts, and VLC
 *     reports: "main error: pictures leaked, trying to workaround".
 */
#include <cstdio>   // printf (not in the original listing, but needed)
#include <cstring>  // strcpy
#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>
#include <GroupsockHelper.hh>

UsageEnvironment* env;
char inputFileName[128] = {0}; // input video file
H264VideoStreamFramer* videoSource;
RTPSink* videoSink;
Boolean reuseFirstSource = False;

void play(); // forward
void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                    char const* streamName, char const* inputFileName);

int main(int argc, char** argv) {
  strcpy(inputFileName, "test.264"); // default
  if (argc == 2) {
    strcpy(inputFileName, argv[1]);
  }
  printf("Using file: %s\n", inputFileName);

  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Session description
  char const* descriptionString = "Session streamed by \"testH264VideoStreamer\"";

  // RTSP server on port 8554
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }

  // Multicast
  // Create 'groupsocks' for RTP and RTCP:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);

  const unsigned short rtpPortNum = 18888;
  const unsigned short rtcpPortNum = rtpPortNum + 1;
  const unsigned char ttl = 255;

  const Port rtpPort(rtpPortNum);
  const Port rtcpPort(rtcpPortNum);

  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
  rtpGroupsock.multicastSendOnly(); // we're a SSM source
  Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
  rtcpGroupsock.multicastSendOnly(); // we're a SSM source

  // Create a 'H264 Video RTP' sink from the RTP 'groupsock':
  OutPacketBuffer::maxSize = 200000;
  videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  // Create (and start) a 'RTCP instance' for this RTP sink:
  const unsigned estimatedSessionBandwidth = 500; // in kbps; for RTCP b/w share
  const unsigned maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen + 1];
  gethostname((char*)CNAME, maxCNAMElen);
  CNAME[maxCNAMElen] = '\0'; // just in case
  RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
                                               estimatedSessionBandwidth, CNAME,
                                               videoSink,
                                               NULL /* we're a server */,
                                               True /* we're a SSM source */);
  // Note: This starts RTCP running automatically

  char const* streamName = "h264ESVideoMulticast";
  ServerMediaSession* sms =
      ServerMediaSession::createNew(*env, streamName, inputFileName,
                                    descriptionString, True /*SSM*/);
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
  rtspServer->addServerMediaSession(sms);
  announceStream(rtspServer, sms, streamName, inputFileName);

  // Start the streaming:
  *env << "Beginning streaming...\n";
  play();

  ////////////////////////////////////////////////////////////////////////
  // Unicast
  {
    char const* streamName = "h264ESVideo";
    ServerMediaSession* sms =
        ServerMediaSession::createNew(*env, streamName, streamName,
                                      descriptionString);
    sms->addSubsession(H264VideoFileServerMediaSubsession
                       ::createNew(*env, inputFileName, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);
    announceStream(rtspServer, sms, streamName, inputFileName);
  }

  env->taskScheduler().doEventLoop(); // does not return

  return 0; // only to prevent compiler warning
}

// Called at end of file; restarts reading from the beginning
void afterPlaying(void* /*clientData*/) {
  *env << "...done reading from file\n";
  videoSink->stopPlaying();
  Medium::close(videoSource);
  // Note that this also closes the input file that this source read from.

  // Start playing once again:
  play();
}

void play() {
  // Open the input file as a 'byte-stream file source':
  ByteStreamFileSource* fileSource =
      ByteStreamFileSource::createNew(*env, inputFileName);
  if (fileSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }

  FramedSource* videoES = fileSource;

  // Create a framer for the Video Elementary Stream:
  videoSource = H264VideoStreamFramer::createNew(*env, videoES);

  // Finally, start playing:
  *env << "Beginning to read from file...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                    char const* streamName, char const* inputFileName) {
  char* url = rtspServer->rtspURL(sms);
  UsageEnvironment& env = rtspServer->envir();
  env << "\n\"" << streamName << "\" stream, from the file \""
      << inputFileName << "\"\n";
  env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;
}
李遲 (Li Chi), Saturday evening, 2015-12-20