Everything posted by dadreamer

  1. It's the date when the pzj coder added that define to the header. What do you mean?
  2. @alvise Based on what Rolf said try to run this VI and report the PlayM4_GetJPEG Error Number. Preview CAM.vi
  3. He already tried to allocate a huge buffer of 3000000 bytes. That buffer is definitely large enough to hold not only a single JPEG, but even several BMPs. Yet that doesn't work either.
  4. You could try switching to PlayM4_GetBMP, but I assume it gives nothing new. It's odd that even the last error number is zero.
  5. Doesn't this work either? Do you ever see PlayM4_GetJPEG returning 1 sometimes? Preview CAM.vi
  6. I slightly modified your VI, try it. Preview CAM.vi These are Windows API bitmap definitions. In a 32-bit IDE they are 14 bytes and 40 bytes long respectively, so the final formula should be 54 + w * h * 4. But it is valid only if you use PlayM4_GetBMP! For the JPEG format there's a different header length to be defined.
  7. That's how some guy did it in Delphi:

     function GetBMP(playHandle: Longint): TBitmap;
     var
       rbs: PDWORD;
       ps: PChar;
       i1: longint;
       bSize: DWORD;
       ms: TMemoryStream;
     begin
       try
         result := TBitmap.Create();
         if playHandle < 0 then exit;
         bSize := 3000000;
         ms := TMemoryStream.Create;
         GetMem(ps, bSize);
         new(rbs);
         if PlayM4_GetBMP(playHandle, ps, bSize, rbs) then
         begin
           i1 := rbs^;
           if i1 > 100000 then
           begin
             ms.WriteBuffer(ps[0], i1);
             ms.Position := 0;
             result.LoadFromStream(ms);
           end;
         end;
       finally
         FreeMemory(ps);
         FreeMemory(rbs);
         ms.Free;
         ps := nil;
         rbs := nil;
         ms := nil;
       end;
     end;

     But that bSize := 3000000; doesn't look elegant, so I'd suggest using PlayM4_GetPictureSize and calculating the final buffer size as sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER) + w * h * 4. Still, you may test with that for now to make sure everything is working. Another option would be to use the display callback set by PlayM4_SetDisplayCallBack. There the frames are already decoded and in YV12 format, so you'd have to convert them to standard RGBA or any other format of your liking.
  8. Both of your PlayM4_GetJPEG calls return FALSE (failure). You need to figure out why.
  9. Ok, almost correct. You could replace that array constant with a U64 "0" constant, then configure that CLFN parameter as Unsigned Pointer-sized Integer and pass the "0" constant to the CLFN. But if it works as it is now, then leave it. Yes, now delete that "460800" constant and wire pJpegSize from the first PlayM4_GetJPEG CLFN. That's the point of using two PlayM4_GetJPEG calls: the first one gets the actual size, the second works with a properly allocated array of that size. Now create an output array for the terminal marked with the red arrow. If it's a valid JPEG stream, you could try to save it to a binary file with a .jpeg extension and open it in an image viewer. In order to convert the stream to a common LV array you may use some extra helper VI like this one.
  10. Yes. Alternatively, read the last paragraph from Rolf. Small note: you don't need to convert the array to U8 explicitly, just do an RMB click on the "0" constant and select its representation.
  11. Why can't you? https://www.ni.com/docs/en-US/bundle/labview/page/glang/initialize_array.html
  12. Now I think PlayM4_InputData is necessary. PlayM4_GetJPEG doesn't have an input to provide the "raw" buffer to decode, hence PlayM4_InputData takes care of it. It's being called on each callback, so it looks like it copies the buffer into its internal storage for processing. Just allocate a U8 array of w * h * 3/2 size and pass it into the function. You should receive a JPEG memory stream in the pJpeg array, which can be translated into a common LV array later.
  13. Well, it's a bit unclear whether you really need it. I found some conversations where the guys passed NULL as hWnd to it when they didn't want it to paint. Logically it should start the buffer copying and decoding along with rendering the contents to some window. But maybe it just prepares its internal structures and the window, as the next call is PlayM4_InputData, which actually grabs the buffer. So I suggest implementing it with the CLFN, but setting hWnd to NULL (0) for now. You may remove it later if it's not really needed.
  14. From my very limited experiments with a common app (not a service), the OS kills it no matter what, leaving no chance to do even some basic cleanup. You may browse through this idea and comments: Notification in Event Structure When Linux is Shuting down Sadly all the attachments there are long gone and I didn't succeed in finding their copies. But even if you implemented that workaround in your application, I highly doubt it would allow you to interrupt the shutdown process to do some work in the application, the way modern Windows or Mac OS do.
  15. "IMKH" FourCC says it's MPEG-PS format. Most decoders should recognize it. Now you can build some logic around PlayCtrl.dll or any other decoder such as ffmpeg. Use that HikVision example and Windows Player SDK Programmer Manual to implement the decoder calls.
  16. It was obvious. Strange that it took us so long to get to it. I think the easiest way would be to alter the callback condition like this: if (cbState == LVBooleanTrue || dwDataType == 1) { } In this case the NET_DVR_SYSHEAD packet will be posted regardless of the state of the cbState button. You should receive it right after starting the playback.
  17. Yes, the PlayM4 headers appear to be a C wrapper around that DLL. So it could be called even from the Event Structure, meaning no C code modifications are needed. But you need to receive NET_DVR_SYSHEAD in LabVIEW somehow, because it carries an MPEG-4 header, based on which the stream gets decoded later. So you need to figure out what happens in the NET_DVR_SYSHEAD case in your DLL and why the event is not posted. Try moving that DbgPrintf condition into the if (cbState == LVBooleanTrue) {...} block and check. Then, if it's called OK, after NumericArrayResize add something like this: if (dwDataType == 1) DbgPrintf("NumericArrayResize returned %d", err); Check that, and so on. Keep debugging until you find something.
  18. If you're planning to implement the PlayM4 API in your code, then yes. You won't be able to get NET_DVR_SYSHEAD called. But I don't know why it suddenly stopped working on your side. Maybe it broke when you switched from NET_DVR_SetRealDataCallBack to NET_DVR_SetStandardDataCallBack... You may try switching back just for a test. Or insert something like this at the beginning of your DataCallBack: if (dwDataType == 1) DbgPrintf("NET_DVR_SYSHEAD received");
  19. In order to use them you need to adapt the callback function as shown in their examples:

      void CALLBACK g_RealDataCallBack_V30(LONG lRealHandle, DWORD dwDataType, BYTE *pBuffer, DWORD dwBufSize, DWORD dwUser)
      {
          HWND hWnd = GetConsoleWindow();
          switch (dwDataType)
          {
          case NET_DVR_SYSHEAD: // system header
              if (!PlayM4_GetPort(&lPort)) // get an unused port from the play library
              {
                  break;
              }
              // m_iPort = lPort; // the first callback carries the system header; store
              // the obtained port in the global and use it for playback on later callbacks
              if (dwBufSize > 0)
              {
                  if (!PlayM4_SetStreamOpenMode(lPort, STREAME_REALTIME)) // real-time stream mode
                  {
                      break;
                  }
                  if (!PlayM4_OpenStream(lPort, pBuffer, dwBufSize, 1024*1024)) // open the stream interface
                  {
                      break;
                  }
                  if (!PlayM4_Play(lPort, hWnd)) // start playback
                  {
                      break;
                  }
              }
              break;
          case NET_DVR_STREAMDATA: // stream data
              if (dwBufSize > 0 && lPort != -1)
              {
                  if (!PlayM4_InputData(lPort, pBuffer, dwBufSize))
                  {
                      break;
                  }
              }
              break;
          }
      }

      But there are a lot of things to be reworked as well.
      - You will need to prepare a proper bitmap buffer to have your pixels loaded into.
      - You will likely need to replace or complement the PlayM4_InputData call with a PlayM4_GetBMP / PlayM4_GetJPEG call.
      - You will need to rework the PostLVUserEvent call and some other functions around it.
  20. - The "1" case is for NET_DVR_SYSHEAD capture, but you assign the NET_DVR_STREAMDATA name inside it; the "2" case is for NET_DVR_STREAMDATA capture.
      - You have an unwired tunnel for the blue wire in some frames of the Event Structure. Don't you see that the small rect is not painted completely?..
      It's senseless! You can input any values into that cluster constant and it changes literally nothing. This constant only defines the User Event data type, that's all. So can you call the functions from the PlayM4 SDK now? You need to find a suitable function to decode the stream and receive a ready-to-use bitmap or pixel array. I don't know exactly which one fits, as I can't find the documentation (but I barely searched, to be honest, because I was busy today). Ok, I found these:

      //get bmp or jpeg
      PLAYM4_API BOOL __stdcall PlayM4_GetBMP(LONG nPort, PBYTE pBitmap, DWORD nBufSize, DWORD *pBmpSize);
      PLAYM4_API BOOL __stdcall PlayM4_GetJPEG(LONG nPort, PBYTE pJpeg, DWORD nBufSize, DWORD *pJpegSize);

      Do you have them in your headers?
  21. It's difficult for me to find the source of the errors from your screenshot. No. Just replace the filename string constant with your randomised filename, and the frame will be recorded to a new file on each event.