Quantcast
Channel: Intel® Software - Media
Viewing all 2185 articles
Browse latest View live

MFXVideoDECODE_Init() returned MFX_ERR_NOT_INITIALIZED for opaque memory

$
0
0

Greetings, I am having problems to get my code to work using the opaque memory. My code works with either the system memory or the video memory.

The setup is like this:

    Decoder + VPP + Encoder

where the Decoder decodes a 1920x1088 H.264 bit-stream, VPP resizes the raw video into 640x480 resolution, and the Encoder encodes the raw video into another H.264 bit-stream.

in the system memory mode, all the frame surfaces are allocated in the system memory, using the default frame allocator. Likewise, in the video memory mode, all the frame surfaces are allocated in the video memory, using the frame allocator provided by VA API. In both cases, each allocated frame surface is managed by an mfxFrameSurface1, which is allocated separately. My code works OK in both the system memory and video memory modes.

However, it does not work at all in the opaque memory mode, in which case I don't allocate any frame surfaces at all. I create two separate arrays of mfxFrameSurface1 pointers. Assign the starting address of the 1st array to the decoder's mfxExtOpaqueSurfaceAlloc, and assign the starting address of the 2nd array to the encoder's mfxExtOpaqueSurfaceAlloc.

Then I set the decoder's ExtParam to point to its mfxExtOpaqueSurfaceAlloc, and the encoder's to its own.

I can't think of anything else I need to configure, but the call to MFXVideoDECODE_Init() always returns error -8 (MFX_ERR_NOT_INITIALIZED). I've attached the MFX tracer log file. Could some MSDK experts review it, and tell me what is missing?

Thanks!

Robby


How is mfxFrameAllocRequest.NumFrameSuggested determined and how to use?

$
0
0

Hi, 

I'm using the Intel Media SDK to encode H264 video. When I use MFXVideoENCODE_QueryIOSurf() to retrieve the number of suggested video frames, the resulting mfxFrameAllocRequest.NumFrameSuggested is 1. I'm encoding at 60fps and although it works in most cases, though sometimes I receive a MFX_WRN_IN_EXECUTION result from MFXVideoCORE_SyncOperation. I can solve this by allocating more surfaces. Though how do I determine how many surfaces I need to allocated when the NumFrameSuggested is not enough?

Thanks
Diederick

MFXVideoCORE_SyncOperation() keeps returning MFX_WRN_IN_EXECUTION

$
0
0

Hi,

I'm using the INDE Media SDK, 6.0.0.388 and using the encoder API for a real time video streaming application. Encoding runs fine for about 15 minutes but after that calls to MFXVideoCore_SyncOperation() keeps returning MFX_WRN_IN_EXECUTION. When I use a 'wait' of e.g. 60000 as used in the tutorials (mediasdk-tutorials-0.0.3/simple_3_encode_vmem_async) the call will block until it times out after 60 seconds. Waiting for 60 seconds for a real time stream is of course not usable and therefore I've set the wait parameter to MFXVideoCode_SyncOperation() to 10 ms. When I use 10ms, I keep receiving MFX_WRN_IN_EXECUTION. 

My encode function looks like this (I removed some redundant checks):

  int VideoEncoderQuickSync::encode(MediaPacket* pkt) {

    uint32_t size = 0;
    uint8_t* size_ptr = NULL;
    mfxStatus status = MFX_ERR_NONE;
    VideoEncoderQuickSyncTask* task = NULL;
    VideoEncoderQuickSyncSurface* surf = NULL;
    VideoFrameEncoded encoded_frame;


    if (0 != getFreeTask(&task)) {
      /*
      When there are now free tasks, we sync. Because we're doing
      a conference video stream we cannot wait forever; 10ms is more
      the max.
      */
      task = tasks[task_dx];
      status = MFXVideoCORE_SyncOperation(session, task->syncpoint, 10);

      if (MFX_ERR_NONE != status) {
        /* Dropping frame, this is where we always get MFX_WRN_IN_EXCECUTION. */
        pkt->makeFree();
        return 0;
      }
      else {

        task_dx = (task_dx + 1) % tasks.size();

        if (0 != writeEncodedTask(task)) {
          /* Never happens. */
          return -7;
        }

        /*
        Reset the bitstream and syncpoint. When we don't
        reset the bitstream we get MFX_ERR_NOT_ENOUGH_BUFFER
        results when calling MFXVideoENCODE_EncodeFrameAsync
        */
        task->syncpoint = NULL;
        task->bitstream.DataLength = 0;
        task->bitstream.DataOffset = 0;
        task->bitstream.TimeStamp = 0;
        task->bitstream.DecodeTimeStamp = 0;

        /* At this point we freed up a task, so getFreeTask() receives
           a free one and we can continue execution.
        */
        task = NULL;
        if (0 != getFreeTask(&task)) {
          SX_ERROR("Even after syncing we couldn't get a free task.");
          return -9;
        }
      }
    }

    /*
    Get a free surface. We have our own struct which has a surface and pkt member.
    The surface member is mfxFrameSurface1. The pkt member is a pointer to the
    raw input data (NV12). We need to keep these together so we know when we
    can reuse the pkt again.
    */
    if (0 != getFreeSurface(&surf)) {
      SX_ERROR("Failed to get a free surface.");
      return -8;
    }

    if (NULL != surf->pkt) {
      /*
        When we found a free surface, we can make reuse the previously
        set pkt member with NV12, so we can reuse it in the application again.
      */
      surf->pkt->makeFree();
    }

    /*
    Wrap the input pkt into the surface.
    */
    surf->surface->Data.Y = pkt->video_plane[0];
    surf->surface->Data.UV = pkt->video_plane[1];
    surf->surface->Data.TimeStamp = pkt->pts;
    surf->surface->Data.Pitch = pkt->video_width[0]; /* @todo use PitchLow / PitchHigh */
    surf->pkt = pkt;

    for (;;) {

      status = MFXVideoENCODE_EncodeFrameAsync(session, NULL, surf->surface, &task->bitstream, &task->syncpoint);

      if (MFX_ERR_NONE < status && !task->syncpoint) {
        if (MFX_WRN_DEVICE_BUSY == status) {
          /* Sleep a bit here ... (never happens) */
          sleep_millis(1);
        }
      }
      else if (MFX_ERR_NONE < status && task->syncpoint) {
        status = MFX_ERR_NONE;
        break;
      }
      else if (MFX_ERR_NOT_ENOUGH_BUFFER == status) {
        SX_ERROR("Bitstream buffer size is insufficient.");
        break;
      }
      else {
        break;
      }
    }
    return 0;
  }

What am I doing wrong? and how can I solve this?

Thanks!

 

nv12 samples/convert ?

$
0
0

Hello,

I am trying to convert yuv420p samples to NV12, in order to use that as a sample into media sdk encoder/vpp.
I used carphone qcif sample yuv format (yuv420p):
http://trace.eas.asu.edu/yuv/

The original sampled is played OK with yuv player.
But on trying to play the converted sample
ffmpeg -pix_fmt yuv420p -s 176x144 -i carphone_qcif.yuv -pix_fmt nv12
carphone_qcif_nv12.yuv
the result seems is bad when trying to play it with the same yuv player (the setting for qcif 176x144, and NV12 ).
Is there something wrong with ffmpeg convert ?

This is the ffmpeg version I'm using:
ubuntu@ubuntu-laptop:~$ ffmpeg
FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.3, Copyright (c)
2000-2009 Fabrice Bellard, et al.
 

Are there any available nv12 samples to check it directly with the yuv player ?

you can find here the original sample (yuv420p) and the one after ffmpeg conversion to nv12 (which I can't play/view for  some reason):

https://drive.google.com/folderview?id=0B22GsWueReZTU3k4NUQzcFNHakE&usp=sharing

 

Thanks,

Ran

 

converting BGR frame to NV12 frame

$
0
0

Sorry if similar subject already was here - i trying to find answer but cannot..

I need to encode sequence of BGR frames to h264 stream.
I am using sample_encode Media SDK as example. This example uses as input YUV stream - so it is not my case.

The first. I am trying to use IPP functions: ippiBGRToYCrCb420_8u_AC4P3R(...) for 32 bit BGR or ippiBGRToYCrCb420_8u_C3P3R(..) for 24 bit BGR source frame. SO i obtain 3 planes for YCbCr.

The second - I wrote function which makes interleave of CbCr into single plane to be NV12 compatible. So right now all works.

BUT! Question is how to avoid mixing/interleaving step in my  algorithm?

I was expecting to find IPP function like ippiBGRToYCrCb420_8u_AC4P2R() and ippiBGRToYCrCb420_8u_C3P2R(..) but seem those function do not exist! 

What i am doing wrong? What i am missing? I expected that if HW accepts only NV12 source frame then IPP probably should provide functions to convert from ordinal formats to two-plane NV12.. Now i need to do color conversion in 2 steps, so i expect performance loss during conversion because i need intermediate memory buffer written and read.

 

MFX tracer not logging DecodeFrameAsync()

$
0
0

Hi,

On my platform (CentOS 7.1, MSDK 2015R6), the MFX tracer is working. However, the log file only records function calls like MFXVideoDECODE_Init() and MFXVideoDECODE_Close(), but not DecodeFrameAsync().

Does the MFX tracer support logging events related to DecodeFrameAsync()? How do I change the configuration for this feature? I understand that logging such events could affect the performance.

Thanks,

Robby

EncodeFrameAsync takes too long (?)

$
0
0

Hello Support team,

I wrote a small test program that use the Media Sever Studio SDK to encode 'video stream' 1024 x 768 pixels into H264.
The program actually renders a rotating line every 30mS into the frame buffer, reading it back
into system memory and send it to the SDK to encode it.
The encoding process is done in a different task that the rendering task.
Until I get the platform with Intel GPU, I run the SDK in SW implementation.
I took the 'sample_encode' program distributed with the SDK and made few
adaptations in the program to accept frames injected by the user and not read from a file.
I synchronize the encoder when a frame is valid.
The h264 encoded file is perfect, but the disturbing phenomena is that the rotating
line is not smooth. It gets stucked rondomaly from time to time (every few seconds).
After invistigating the root cause, the 'blame' fell on the MFXVideoENCODE::EncodeFrameAsync function.
Most of the time it takes around 7 to 10mS ro run, but sometimes it takes few hundred miliseconds (sometimes up
to 1 second!). I Measure the time by inserting 'SyncOperation' call, to make the encoding complete.
More strange is the fact that, running the encoder with no connection to the rendering task (encoding same frame over
and over) cause the rendering to get stucked as if the encoder use resources of the rendering task.
I tried different settings of the command line parameters with no success.
It looks like the encoding process works very hard on some frames...
My question is:
Is it a normal behavior in the compresion process? Is it only in the SW implementation?
Should I spend more effort on the issue, or leave it until I get the HW and run it with HW
accelaration?

Many thanks,

Joseph

 

 

Noticeable frame drops using camera streaming sample on Android

$
0
0

Hi,

We are seeing noticeable frame drops using the Media for Mobile Camera Streaming Sample on multiple Android devices. We are streaming to a WOWZA server from various Android devices and receiving the stream on VLC player on PC and native player on Android phones. The frame drops are readily visible on both the preview screen on the capture side (after streaming starts) as well as the playback side, although typically worst on the playback side. The framerate is set to 30fps. The issue is visible at both 640x480 and 320x240. We have tried using a fairly recent LG G4 with a strong processor and good media acceleration, but still see this issue. 

I would like to know if this is a known issue or if there are settings/configurations that we can play with to improve encode/streaming performance.

Thanks,

Amir

 


Fast forward encoded H264

$
0
0

Hello fellows,

I am not sure what is the correct file extension of the encoded output file I should set.
If I set '.h264', it will play only on VLC while setting it to '.mp4' it will play only on Windows Media Player.
In both cases, fast forwarding the video is problematic.
The Media Player won't let me FF while in the VLC I can FF, but the pictures become blocky and smeared.
My program is the 'sample_encode' that comes with the SDK, with no changes in the default parameters.
I use the SW encoder since I haven't got yet a platform with a Intel GPU.

Many thanks,

Joseph

 

MFX/MSDK and OpenCL interop on Linux ?

$
0
0

Hi, I am planning to optimize one of my modules with OpenCL.

I am using CentOS 7.1 and MSDK 2015R6. I looked up the example/sample codes provided by Intel. Two examples that deal with MSDK and OpenCL interop are all for Windows, as they both depend on the function clEnqueueAcquireDX9MediaSurfacesKHR() to use media surfaces as OpenCL memory objects, and that function only works with the DX9 adapter API.

Two other examples (motion estimation) that do work on Linux appear to be just OpenCL. I don't see MSDK and OpenCL interop in them.

Is MFX/MSDK and OpenCL interop currently supported on Linux ?

Thanks,

Robby

Memory leak?

$
0
0

With this piece of test code memory seems to leak. Am I doing something wrong?

while(1)
{
    mfxVersion min_version;

    min_version.Major = 1;
    min_version.Minor = 0;

    m_mfxSession.Init(MFX_IMPL_HARDWARE_ANY, &min_version);
    m_mfxSession.Close();
}

I'm using the video driver dated 5/25/2015, version 10.18.10.4226

Way to output/dump Encoded bitstream in sample_h265_gaa

$
0
0

Hey, 

I successfully installed Intel MSS R6 and build the sample_h265_gaa application using visual studio. I passed the following commandline and it worked fine, 

sample_h265_gaa.exe -i E:\apple_tree.yuv -w 1920 -h 1080

But I want to know if there is a way to output the encoded bitstream from the application ? Similar to the -o outputfile.265 option in the sample_encode application. 

Also, what is the format for the 'params.txt' file ? 

System specs- Windows 8, Intel i5 4400, Intel HD Graphics 4600

question about sample_decode

$
0
0

hi Intel-giant,

OS: Ubuntu 12.04

       MediaSamples_Linux_6.0.16043175.175

       MediaServerStudioEssentials2015R6

Platform:  i5-4570S

       I use sample_decode_drm and got the following messages, I have questions

       Q1. the frame number is 187 and the "ReadNextFrame" was shown only 4 times. Is there any document describing this? Or could you explain this?

       Q2. I have a frame based bit-stream, and I have no clear idea about feeding it. Any hint/document for me?

[release] $ ./sample_decode_drm h264 -sw -i out.264
############# (sample_utils.cpp|ReadNextFrame|511)
Decoding Sample Version 0.0.000.0000

Input video     AVC
Output format   YUV420
Resolution      1920x1088
Crop X,Y,W,H    0,0,0,0
Frame rate      30.00
Memory type             system
MediaSDK impl           sw
MediaSDK version        1.16

Decoding started
############# (sample_utils.cpp|ReadNextFrame|511)
############# (sample_utils.cpp|ReadNextFrame|511)
############# (sample_utils.cpp|ReadNextFrame|511)
Frame number:  187, fps: 245.124, fread_fps: 0.000, fwrite_fps: 0.000
Decoding finished

 

[release] $ ./sample_decode_drm h264 -hw -i out.264
############# (sample_utils.cpp|ReadNextFrame|511)
libva info: VA-API version 0.35.0
libva info: va_getDriverName() returns 0
libva info: User requested driver 'iHD'
libva info: Trying to open /opt/intel/mediasdk/lib64/iHD_drv_video.so
libva info: Found init function __vaDriverInit_0_32
libva info: va_openDriver() returns 0
Decoding Sample Version 0.0.000.0000

Input video     AVC
Output format   YUV420
Resolution      1920x1088
Crop X,Y,W,H    0,0,0,0
Frame rate      30.00
Memory type             system
MediaSDK impl           hw
MediaSDK version        1.16

Decoding started
############# (sample_utils.cpp|ReadNextFrame|511)
############# (sample_utils.cpp|ReadNextFrame|511)
############# (sample_utils.cpp|ReadNextFrame|511)
Frame number:  187, fps: 688.082, fread_fps: 0.000, fwrite_fps: 0.000
Decoding finished

mfxExtCodingOptionSPSPPS from MFXVideoENCODE_Query

$
0
0

Hi,

We are resetting the encoder when the aspect ratio changes. We need the updated sps/pps before we call MFXVideoENCODE_Init, so MFXVideoENCODE_GetVideoParam isn't useful as it returns the current sps/pps and not the future one. We tried mfxExtCodingOptionSPSPPS with MFXVideoENCODE_Query but it fails with various errors, depending if we set mfxExtCodingOptionSPSPPS in the input and/or output mfxVideoParam. We could have a separate (joined) session just to get the sps/pps, but it seems heavyweight. How do you get sps/pps from mfxVideoParam without initializing anything?

Bruno

Question for INDE Media sdk with previous HEVC SW plugin

$
0
0
  • Product version  - INDE Update2 (Media SDK for Windows 6.0.0.388)
  • Platform you are building on – Windows 8.1 64bit  VisualStudio 2013. build for WIN32 enviroment

Hi. now I'm working for upgrading MediaSDK from 2013R3 to 2015 to support  HEVC HW decoding.

However I'm having a difficulty for HEVC SW plug-in loading.

My application was developed with MediaSDK 2013R3 and HEVC SW codec plugin at win32 enviroment.
So I changed all header and lib from INDE's Media SDK for Windows 6.0.0.388 and changed some plug-in loading code.
(plugin_loader.h is totally different with previous version(2013r3))

Here is my code------------------------------------------------------------------------------

msdk_so_handle m_PluginModule = msdk_so_load(L"mfxplugin32_hevcd_sw.dll");

   // Load Create function
   PluginModuleTemplate::fncCreateDecoderPlugin pCreateFunc = (PluginModuleTemplate::fncCreateDecoderPlugin)msdk_so_get_addr(m_PluginModule, "mfxCreateDecoderPlugin");

   MFXDecoderPlugin *m_pPlugin;
   m_pPlugin = (*pCreateFunc)();

   mfxPlugin plg;
   MSDK_ZERO_MEMORY(plg);
   plg = make_mfx_plugin_adapter(m_pPlugin);
   
   sts = MFXVideoUSER_Register(session, MFX_PLUGINTYPE_VIDEO_DECODE, &plg);

-------------------------------------------------------------------------------------------------------

 

However when i call 'MFXVideoUSER_Register', it return MFX_ERR_UNKNOWN(-1)

Here is trace log-------------------------------------------------------------------------------

8100 2015-11-25 15:55:7 function: MFXVideoUSER_Register(mfxSession session=012AC298, mfxU32 type=1, mfxPlugin *par=0102DB94) +
8100 2015-11-25 15:55:7     mfxSession session=012CAA84
8100 2015-11-25 15:55:7     mfxU32 type=1
8100 2015-11-25 15:55:7     par.pthis=01314BD8
8100 2015-11-25 15:55:7     par.PluginInit=010E1EC9
8100 2015-11-25 15:55:7     par.PluginClose=010E171C
8100 2015-11-25 15:55:7     par.GetPluginParam=010E1785
8100 2015-11-25 15:55:7     par.Submit=00000000
8100 2015-11-25 15:55:7     par.Execute=010E10A5
8100 2015-11-25 15:55:7     par.FreeResources=010E1582
8100 2015-11-25 15:55:7     par.reserved[]={ CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC }

8100 2015-11-25 15:55:7 >> MFXVideoUSER_Register called
8100 2015-11-25 15:55:7     mfxSession session=012CAA84
8100 2015-11-25 15:55:7     mfxU32 type=1
8100 2015-11-25 15:55:7     par.pthis=01314BD8
8100 2015-11-25 15:55:7     par.PluginInit=010E1EC9
8100 2015-11-25 15:55:7     par.PluginClose=010E171C
8100 2015-11-25 15:55:7     par.GetPluginParam=010E1785
8100 2015-11-25 15:55:7     par.Submit=00000000
8100 2015-11-25 15:55:7     par.Execute=010E10A5
8100 2015-11-25 15:55:7     par.FreeResources=010E1582
8100 2015-11-25 15:55:7     par.reserved[]={ CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC, CCCCCCCC }

8100 2015-11-25 15:55:7 function: MFXVideoUSER_Register(0.235479 msec, status=MFX_ERR_UNKNOWN) -

 

----------------------------------------------------------------------------------------------------------

 

Please give me advise.

Thanks advance.

 


simple_decode Error! Device operation failure. src/simple_decode.cpp 197

$
0
0

Hi, I tried to run few simple code on Linux CentOS 7, using MediaServerStudioEssentials2015R6. installed following
MediaServerStudioEssentials2015R6\SDK2015Production16.4.2.1\media_server_studio_getting_started_guide.pdf
and try running one example applications in
http://software.intel.com/sites/default/files/mediasdk-tutorials-0.0.3.tar.gz
but emerge some problems~

Platform:
——————————————————————————————————————————
[root@storm _build]# cat /proc/cpuinfo | grep name | cut -f2 -d: | uniq -c
      8  Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz
[root@storm _build]# lspci -nn -s 00:02.0
00:02.0 VGA compatible controller [0300]: Intel Corporation Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics Controller [8086:0412]

(rev 06)
——————————————————————————————————————————

Error:
——————————————————————————————————————————
[root@storm _build]# ./simple_decode test_stream.264 output_stream_de.yuv
libva info: VA-API version 0.35.0
libva info: va_getDriverName() returns 0
libva info: User requested driver 'iHD'
libva info: Trying to open /opt/intel/mediasdk/lib64/iHD_drv_video.so
libva info: Found init function __vaDriverInit_0_32
libva info: va_openDriver() returns 0

 Device operation failure. src/simple_decode.cpp 197
——————————————————————————————————————————

What should I do? can anyone help me? Thanks~

question about simple_decode

$
0
0

hi Intel-giant,

OS: Ubuntu 12.04

       MediaSamples_Linux_6.0.16043175.175

       MediaServerStudioEssentials2015R6

Platform:  i5-4570S

     I got the same messages as below in the local machine. Please help to clarify. Thanks

     I have referred to  https://software.intel.com/en-us/forums/intel-media-sdk/topic/543365

[release] $ ./sample_decode_x11 h264 -i 1920x1088.264         ==> this is ok
libva info: VA-API version 0.35.0
libva info: va_getDriverName() returns 0
libva info: User requested driver 'iHD'
libva info: Trying to open /opt/intel/mediasdk/lib64/iHD_drv_video.so
libva info: Found init function __vaDriverInit_0_32
libva info: va_openDriver() returns 0
Decoding Sample Version 0.0.000.0000

Input video     AVC
Output format   YUV420
Resolution      1920x1088
Crop X,Y,W,H    0,0,0,0
Frame rate      30.00
Memory type             system
MediaSDK impl           hw
MediaSDK version        1.16

Decoding started
Frame number: 8117, fps: 701.897, fread_fps: 0.000, fwrite_fps: 0.000
Decoding finished
[release] $ pwd
/opt/intel/mediasdk/samples/__cmake/intel64.make.release/__bin/release

 

[_build] $ ./simple_session -hw                                  ==> this is not ok
libva info: VA-API version 0.35.0
libva info: va_getDriverName() returns 1
libva error: va_getDriverName() failed with operation failed,driver_name=i965
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)
[_build] $ ./simple_decode -hw                                   ==> this is not ok
error: source file name not set (mandatory)
[_build] $ ./simple_decode -hw 1920x1088.264         ==> this is not ok
libva info: VA-API version 0.35.0
libva info: va_getDriverName() returns 1
libva error: va_getDriverName() failed with operation failed,driver_name=i965
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)
[_build] $ pwd
/home/dspuser/Downloads/mediasdk-tutorials-0.0.3/_build

Media SDK in GVT-D or GVT-G

$
0
0

Hello,

I would like to try to use the Intel Media SDK in a GVT-D or GVT-G configuration under Xen or KVM (Esxi seems not good at any of the two technology).

The problem is: which Chipset and CPU supports GVT-*

Intel is not clear regarding the support. Reading around it seems you need a Xeon with Iris PRO. Iris and HD are not enough. Nor a Core i7 with Iris Pro.

Is there a particular place where you can check for it?

Someone managed to install everything of a Gigabyte Brix that uses a 4th gen core i7. In Xen HCL you see just Xeon E3-12** between the supported processors but just for passthrough. Around internet seems that passthrough should work with intel intel HD > 4600 and Iris and Iris Pro.

The situation is mixed, and the chipset that you need for everything to work too. 

The bios is another story apart, but that problem is solvable speaking directly to the company producing the motherboard.

Is there someone that knows some of the answers?

MFXVideoENCODE_Query() always returns MFX_ERR_UNSUPPORTED

$
0
0

Hi,

I need a help with MFXVideoENCODE_Query() from Intel Media SDK. I wrote code posted below and unfortunately it doesn't work. MFXVideoENCODE_Query() always returns error MFX_ERR_UNSUPPORTED. I played with parameters, but each time failed.

At the same time, I'm able to encode video with these parameters using HW HEVC encoder on my Skylake. So, only MFXVideoENCODE_Query() doesn't work correctly for me.

I tried do the same for AVC codec and also tried to call MFXVideoENCODE_Query() after MFXVideoUSER_Load(), but in both cases I constantly get the same error.

I'm running on Windows 7 64bit. Video driver version is 10.18.15.4279 / 24.08.2015.

Thanks in advance!

    mfxVideoParam mfx_settings_in = { 0 }, mfx_settings_out = { 0 };
    mfx_settings_in.mfx.CodecId          = MFX_CODEC_HEVC;
    mfx_settings_in.mfx.CodecProfile = MFX_PROFILE_HEVC_MAIN;
    mfx_settings_in.mfx.CodecLevel = MFX_LEVEL_UNKNOWN;
    mfx_settings_in.mfx.FrameInfo.FourCC = MFX_FOURCC_NV12;
    mfx_settings_in.IOPattern            = MFX_IOPATTERN_IN_SYSTEM_MEMORY;
    mfx_settings_in.mfx.TargetUsage = MFX_TARGETUSAGE_BEST_SPEED;
    mfx_settings_in.mfx.TargetKbps = 1000;
    mfx_settings_in.mfx.RateControlMethod = MFX_RATECONTROL_VBR;
    mfx_settings_in.mfx.FrameInfo.FrameRateExtN = 30;
    mfx_settings_in.mfx.FrameInfo.FrameRateExtD = 1;
    mfx_settings_in.mfx.FrameInfo.FourCC = MFX_FOURCC_NV12;
    mfx_settings_in.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
    mfx_settings_in.mfx.FrameInfo.PicStruct = MFX_PICSTRUCT_PROGRESSIVE;
    mfx_settings_in.mfx.FrameInfo.CropX = 0;
    mfx_settings_in.mfx.FrameInfo.CropY = 0;
    mfx_settings_in.mfx.FrameInfo.CropW = 1280;
    mfx_settings_in.mfx.FrameInfo.CropH = 720;
    mfx_settings_in.mfx.FrameInfo.Width = ALIGN32(1280);
    mfx_settings_in.mfx.FrameInfo.Height = ALIGN32(720);

    mfxStatus status;
    mfxVersion version = { 0, 1 };
    mfxSession session;
    status = MFXInit(MFX_IMPL_HARDWARE, &version, &session);
    status = MFXVideoENCODE_Query(session, &mfx_settings_in, &mfx_settings_out);
 

Regards,

Timur

Is E3-1275 v5 is supported?

$
0
0

I am working with the server studio sdk and would like to find out if this cpu : Xeon E3-1275 v5 is supported in the latest sdks (R7-windows and R6-linux)

The chipset is Intel® C236 chipset.

Thank you

koby

Viewing all 2185 articles
Browse latest View live


<script src="https://jsc.adskeeper.com/r/s/rssing.com.1596347.js" async> </script>