Posts posted by dadreamer

  1. I slightly modified your VI, try it.

    Preview CAM.vi

    1 hour ago, alvise said:

    Where did "BITMAPFILEHEADER" and "BITMAPINFOHEADER" come from here?

These are Windows API bitmap structure definitions. In a 32-bit IDE they are 14 and 40 bytes long respectively, so the final formula should be 54 + w * h * 4. But it is valid only if you use PlayM4_GetBMP! For the JPEG format there's a different header length to account for.
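
    Just to illustrate where the 54 comes from, here's a tiny C check; the 640x480 resolution is only an example, not something taken from your camera:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* BITMAPFILEHEADER is 14 bytes and BITMAPINFOHEADER is 40 bytes,
           so a 32-bit frame of w x h pixels needs 54 + w * h * 4 bytes. */
        DWORD w = 640, h = 480;
        DWORD bmpSize = (DWORD)(sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER)) + w * h * 4;
        printf("headers: %u + %u, full BMP buffer: %lu bytes\n",
               (unsigned)sizeof(BITMAPFILEHEADER), (unsigned)sizeof(BITMAPINFOHEADER),
               (unsigned long)bmpSize);
        return 0;
    }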

2. That's how some guy did it in Delphi:

    function GetBMP(playHandle: Longint): TBitmap;
    var
      rbs: PDWORD;        // receives the actual bitmap size
      ps: PAnsiChar;      // raw buffer for the bitmap data
      i1: Longint;
      bSize: DWORD;
      ms: TMemoryStream;
    begin
      Result := TBitmap.Create();
      if playHandle < 0 then Exit;
      ms := nil;
      ps := nil;
      rbs := nil;
      try
        bSize := 3000000;              // arbitrary "big enough" buffer
        ms := TMemoryStream.Create;
        GetMem(ps, bSize);
        New(rbs);
        if PlayM4_GetBMP(playHandle, ps, bSize, rbs) then
        begin
          i1 := rbs^;
          if i1 > 100000 then
          begin
            ms.WriteBuffer(ps[0], i1);
            ms.Position := 0;
            Result.LoadFromStream(ms);
          end;
        end;
      finally
        if ps <> nil then FreeMem(ps);
        if rbs <> nil then Dispose(rbs);
        ms.Free;
      end;
    end;

But that bSize := 3000000; doesn't look elegant enough, so I'd suggest using PlayM4_GetPictureSize and calculating the final buffer size as

    sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER) + w * h * 4

But you may test with that for now to make sure everything is working. Another option would be to use the DisplayCallback, which is set by PlayM4_SetDisplayCallBack. There the frames should already be decoded and in YV12 format, so you'd have to convert them to standard RGBA or any other format of your liking.
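
    To make the sizing idea concrete, here's a rough, untested C sketch. I'm assuming PlayM4_GetPictureSize takes the port and two LONG pointers for width and height, and that "PlayM4.h" is the header that declares the PlayM4_* calls; check both against your SDK:

    #include <windows.h>
    #include <stdlib.h>
    #include "PlayM4.h"   /* whichever header declares the PlayM4_* functions in your SDK */

    /* Grab one decoded frame as a BMP; returns the byte count or 0 on failure. */
    DWORD GrabBMP(LONG lPort, PBYTE *ppBitmap)
    {
        LONG w = 0, h = 0;
        DWORD bmpSize = 0;
        if (!PlayM4_GetPictureSize(lPort, &w, &h))
            return 0;
        DWORD bufSize = (DWORD)(sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER)) + (DWORD)(w * h * 4);
        PBYTE pBitmap = (PBYTE)malloc(bufSize);
        if (pBitmap == NULL)
            return 0;
        if (!PlayM4_GetBMP(lPort, pBitmap, bufSize, &bmpSize))
        {
            free(pBitmap);
            return 0;
        }
        *ppBitmap = pBitmap;   /* buffer starts with BITMAPFILEHEADER; caller frees it */
        return bmpSize;
    }

    The returned buffer can be dumped to a .bmp file as-is, which is an easy way to verify the pipeline before wiring it into LabVIEW.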

3. Ok, almost correct. You could replace that array constant with a U64 "0" constant, then configure that CLFN parameter as an Unsigned Pointer-sized Integer and pass the "0" constant to the CLFN. But if it works as it is now, then leave it.

Yes, now delete that "460800" constant and wire the pJpegSize from the first PlayM4_GetJPEG CLFN. That's the point of using two PlayM4_GetJPEG calls: the first one gets the actual size, the second one works with a properly allocated array of that size (see the C sketch at the end of this post).


Now create an output array for the terminal marked with the red arrow. If it's a valid JPEG stream, you could try saving it to a binary file with a .jpeg extension and opening it in an image viewer. In order to convert the stream to a common LV array you may use some extra helper VI like this one.
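
    In C terms the two-call pattern looks roughly like this. It's a sketch only, and it assumes the library tolerates a NULL buffer on the size-query call, which is what passing the "0" pointer-sized constant from the CLFN amounts to:

    DWORD jpegSize = 0;
    /* 1st call: only query the required size (NULL buffer, zero length) */
    PlayM4_GetJPEG(lPort, NULL, 0, &jpegSize);
    /* 2nd call: fetch the data into a buffer of exactly that size */
    PBYTE pJpeg = (PBYTE)malloc(jpegSize);
    if (pJpeg && PlayM4_GetJPEG(lPort, pJpeg, jpegSize, &jpegSize))
    {
        /* pJpeg now holds a complete JPEG stream of jpegSize bytes */
    }
    free(pJpeg);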

  4. 15 minutes ago, alvise said:

Now I want to adapt ''PlayM4_InputData'' to LabVIEW, but your previous idea was to replace it with ''PlayM4_GetJPEG''.

I think PlayM4_InputData is necessary now. PlayM4_GetJPEG doesn't have an input to provide the "raw" buffer to decode, hence PlayM4_InputData takes care of that. It's called on each callback, so it looks like it copies the buffer into its internal storage for processing.

    15 minutes ago, alvise said:

So I'm looking into this function, but there is ''pJpeg'' in the parameters as in the picture below. What values should be sent to it, and I guess its output should be an array of images, right?

Just allocate a U8 array of "w * h * 3/2" size and pass it into the function. You should receive a JPEG memory stream in the pJpeg array, which can be translated into a common LV array later.
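
    As a C sketch of that suggestion (a fixed buffer sized from the frame resolution rather than the two-call query shown in the post above; PlayM4_GetPictureSize is assumed to be available for reading the resolution):

    LONG w = 0, h = 0;
    PlayM4_GetPictureSize(lPort, &w, &h);          /* or use the resolution you already know */
    DWORD bufSize = (DWORD)(w * h * 3 / 2);        /* YV12 frame size, used as the buffer size here */
    PBYTE pJpeg = (PBYTE)malloc(bufSize);
    DWORD jpegSize = 0;
    if (pJpeg && PlayM4_GetJPEG(lPort, pJpeg, bufSize, &jpegSize))
    {
        /* jpegSize bytes of JPEG data are now in pJpeg, ready to be handed over to LabVIEW */
    }
    free(pJpeg);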

  5. 1 hour ago, alvise said:

    PlayM4_Play

Well, it's a bit unclear whether you really need it. I found some conversations where people were passing NULL as hWnd when they didn't want it to paint. Logically it should start the buffer copying and decoding along with rendering the contents to some window. But maybe it just prepares its internal structures and the window, as the next call is PlayM4_InputData, which actually grabs the buffer.

So I suggest implementing it with a CLFN, but setting hWnd to NULL (0) for now. You may remove it later if it's not really needed.
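
    In C that's simply the following; the PlayM4_GetLastError mention is only a guess at how the error could be inspected, so drop it if your header doesn't export it:

    /* Sketch: start decoding without binding a window; a NULL hWnd means "don't paint" */
    if (!PlayM4_Play(lPort, NULL))
    {
        /* handle the error here, e.g. log PlayM4_GetLastError(lPort) if your header exports it */
    }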

  6. 11 minutes ago, Łukasz said:

    It also works. The service can be started and stopped. However, the stop function wasn’t defined so Linux sent KillSignal and the application was killed. I think it's not safe, I don't know what happened with all those open references etc. but I’m sure that I don't have any log from the application. So the first question is how to send the signal/message to the LabVIEW app? There is a possibility to specify the execution file which will be triggered on service stop, so this can be used.

From my very limited experiments with a common app (not a service), the OS kills it no matter what, leaving no chance to do even some basic cleanup. You may browse through this idea and its comments: Notification in Event Structure When Linux is Shuting down. Sadly, all the attachments there are long gone and I didn't succeed in finding copies. But even if you implemented that workaround in your application, I highly doubt it would let you interrupt the shutdown process to do some work in the application, the way modern Windows or macOS do.

  7. 22 minutes ago, alvise said:

There is a DLL file like the one below; isn't there a shorter way to get video stream information using it?

Yes, the PlayM4 headers appear to be a C wrapper around that DLL. So it could even be called from the Event Structure, with no C code modifications needed. But you need to receive NET_DVR_SYSHEAD in LabVIEW somehow, because it carries an MPEG-4 header, based on which the stream gets decoded later. So you need to figure out what happens in the NET_DVR_SYSHEAD case in your DLL and why the event is not posted.

So, try moving that DbgPrintf condition into the if (cbState == LVBooleanTrue) {...} branch and check again. Then, if it's called OK, add something like this after NumericArrayResize:

    if (dwDataType==1) DbgPrintf("NumericArrayResize returned %d", err);

Then check that, and so on. Keep debugging until you find where it breaks.
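
    A sketch of the kind of tracing I mean inside your DataCallBack; cbState, handle, refPtr and eventData are placeholders for whatever names your wrapper actually uses:

    if (cbState == LVBooleanTrue)
    {
        if (dwDataType == NET_DVR_SYSHEAD)
            DbgPrintf("SYSHEAD received, dwBufSize = %u", dwBufSize);

        MgErr err = NumericArrayResize(uB, 1, (UHandle*)&handle, dwBufSize);
        if (dwDataType == NET_DVR_SYSHEAD)
            DbgPrintf("NumericArrayResize returned %d", err);

        /* ... copy pBuffer into the handle and fill the event cluster ... */

        err = PostLVUserEvent(*refPtr, &eventData);
        if (dwDataType == NET_DVR_SYSHEAD)
            DbgPrintf("PostLVUserEvent returned %d", err);
    }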

  8. 4 hours ago, alvise said:

    Isn't that a problem?

If you're planning to implement the PlayM4 API in your code, then yes: you won't be able to get NET_DVR_SYSHEAD handled. But I don't know why it suddenly stopped working on your side. Maybe it happened when you switched from NET_DVR_SetRealDataCallBack to NET_DVR_SetStandardDataCallBack... You may try switching back just for a test. Or insert something like this at the beginning of your DataCallBack:

    if (dwDataType==1) DbgPrintf("NET_DVR_SYSHEAD received");

     

  9. 11 minutes ago, alvise said:

Yes, the header file contains these calls, but I don't know how to use them in C++ code. I will try to do that.

    In order to use them you need to adapt the callback function as shown in their examples:

    // lPort is the global play port used by the PlayM4 library (declared elsewhere in the sample)
    LONG lPort = -1;

    void CALLBACK g_RealDataCallBack_V30(LONG lRealHandle, DWORD dwDataType, BYTE *pBuffer, DWORD dwBufSize, DWORD dwUser)
    {
        HWND hWnd = GetConsoleWindow();

        switch (dwDataType)
        {
        case NET_DVR_SYSHEAD: // system header
            if (!PlayM4_GetPort(&lPort)) // get an unused port number from the play library
            {
                break;
            }
            // The first callback carries the system header; the obtained port number is kept
            // in lPort and reused to play the data arriving in subsequent callbacks.
            if (dwBufSize > 0)
            {
                if (!PlayM4_SetStreamOpenMode(lPort, STREAME_REALTIME)) // set real-time stream mode
                {
                    break;
                }
                if (!PlayM4_OpenStream(lPort, pBuffer, dwBufSize, 1024 * 1024)) // open the stream interface
                {
                    break;
                }
                if (!PlayM4_Play(lPort, hWnd)) // start playback
                {
                    break;
                }
            }
            break;

        case NET_DVR_STREAMDATA: // stream data
            if (dwBufSize > 0 && lPort != -1)
            {
                if (!PlayM4_InputData(lPort, pBuffer, dwBufSize)) // feed the packet to the decoder
                {
                    break;
                }
            }
            break;
        }
    }

But there are a lot of things to be reworked as well:

- You will need to prepare a proper bitmap buffer to load your pixels into.

- You will likely need to replace or complement the PlayM4_InputData call with a PlayM4_GetBMP / PlayM4_GetJPEG call.

- You will need to rework the PostLVUserEvent call and some of the code around it. A rough sketch of such a branch follows below.
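
    Not a ready solution, just a sketch of how those pieces could fit together in the NET_DVR_STREAMDATA branch; MAX_JPEG_SIZE, gUserEvent and the event cluster are placeholders for whatever your wrapper defines:

    case NET_DVR_STREAMDATA: // stream data
        if (dwBufSize > 0 && lPort != -1)
        {
            if (PlayM4_InputData(lPort, pBuffer, dwBufSize)) // feed the raw packet to the decoder
            {
                DWORD jpegSize = 0;
                PBYTE pJpeg = (PBYTE)malloc(MAX_JPEG_SIZE);  /* e.g. w * h * 3 / 2 */
                if (pJpeg && PlayM4_GetJPEG(lPort, pJpeg, MAX_JPEG_SIZE, &jpegSize))
                {
                    /* copy jpegSize bytes into a LabVIEW handle (NumericArrayResize + MoveBlock)
                       and post it to the Event Structure with PostLVUserEvent(gUserEvent, &eventData) */
                }
                free(pJpeg);
            }
        }
        break;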

  10. 3 hours ago, alvise said:

    When I make a change like below, it only saves one file, shouldn't it save a new file for each event iteration? Why does it only save one file? Is the method I'm using correct?

- The "1" case is for the NET_DVR_SYSHEAD capture, but you assign the NET_DVR_STREAMDATA name inside it; the "2" case is for the NET_DVR_STREAMDATA capture.

- You have an unwired tunnel for the blue wire in some frames of the Event Structure. Don't you see that the small tunnel square is not filled in completely?

    49 minutes ago, alvise said:

    Are you saying it's unnecessary to send any value to "dwDataType" like in the photo?

It's pointless! You can enter any values into that cluster constant and it changes literally nothing. That constant only defines the User Event data type, and that's all.

    51 minutes ago, alvise said:

    I'm looking at this sample code as I don't have a solution at the moment. How can we use this sample code to achieve the desired result?

So can you call the functions from the PlayM4 SDK now? You need to find a suitable function to decode the stream and return a ready-to-use bitmap or pixel array. I don't know exactly which one is suitable, as I can't find the documentation (but I barely searched, to be honest, because I was busy today).

    Ok, I found these:

    //get bmp or jpeg
    PLAYM4_API BOOL __stdcall PlayM4_GetBMP(LONG nPort,PBYTE pBitmap,DWORD nBufSize,DWORD* pBmpSize);
    PLAYM4_API BOOL __stdcall PlayM4_GetJPEG(LONG nPort,PBYTE pJpeg,DWORD nBufSize,DWORD* pJpegSize);

    Do you have them in your headers?

  11. 7 minutes ago, alvise said:

    Yes, I'm getting errors like the following.

It's difficult for me to find the source of the errors from your screenshot.

    7 minutes ago, alvise said:

    With the method you suggested, it is necessary to stop and restart the VI for each new package.

    No. Just replace the filename string constant with your randomised filename and the frame will be recorded to a new file on each event.

  12. 10 hours ago, alvise said:

    -I'm looking at the example here. I think it is necessary to read the 'NET_DVR_SYSHEAD' package.

Maybe. I've been browsing their samples too. It looks like they use a custom MPEG-4 decoder. In the NET_DVR_SYSHEAD case they just initialize the decoder, as this is the very first frame. But it seems that the actual pBuffer contents of the system header aren't used in any way. Or am I wrong? Later, in the NET_DVR_STREAMDATA case, they obtain the packets and render them into a separate window using that MP4 decoder. Maybe it would be easier to use it instead of reinventing not just a wheel, but an entire car. Could you try to #include the "plaympeg4.h" header in your code and recompile? Do you get any errors?

    10 hours ago, alvise said:

    I took a few records as below and created the records with a normal button.

A normal button is not suitable here for the reasons I explained earlier. You could try generating a random file name on each packet arrival (say, from the packet number or the current date and time), so the data gets recorded into a new file instead of being written to the same one.

13. SetCbState returns nothing! Its return type is void, which means the function has no return value at all (in Pascal and Delphi such a function is called a procedure). So after you set a return value for it in the CLFN settings, you keep getting totally unrelated numbers, e.g. return values of some internal functions in the LabVIEW core or whatever. It's useless to search for correlations between that fictional return value and dwDataType. And even if there were some connection (say, changing some constant makes LabVIEW recompile everything and affects the memory layout in some 'unusual' way), it would be of absolutely no use.

    6 hours ago, alvise said:

    -I try as follows, but the size of the array does not remain constant, it constantly changes between 0 and 4350

It seems to be more complicated, then. Honestly, I don't even have an idea of where to start. Maybe you could collect several (or more) files with different sizes and upload them?

  14. 13 hours ago, alvise said:

    I can only capture the ''NET_DVR_STREAM DATA'' packet.

In some of your previous tests you showed that NET_DVR_SYSHEAD arrives first and is followed by a sequence of NET_DVR_STREAMDATA packets. I can't say now whether we really need NET_DVR_SYSHEAD to analyse NET_DVR_STREAMDATA, but it would be useful to have both of them.

    13 hours ago, alvise said:

    I saved 2 different samples.

    There's something strange in those samples.

(screenshot: the saved sample file is only 20 bytes)

20 bytes only? Really? Well, very good compression achieved! 🙂 Could you add an Array Size node to your handle wire to check its actual size at runtime?
