Previous post: Implementing a VR Player with SharpDX (C# + DirectX), Part 1: 3D Image Playback
In the previous post we built a VR player that displays side-by-side 3D images by making some simple changes to the Holographic DirectX 11 App (Universal Windows) project template. In this post we continue modifying that code to add playback of side-by-side 3D video.
First, download the finished code from the previous post on GitHub; this post builds on that project.
Download address: https://github.com/mshololens/VR-Player/releases — download the Image_Player zip package.
This time we use a MediaPlayer class, replacing the image-updating logic from the previous post, to implement video playback.
Before writing any code, right-click References under the project in Solution Explorer, choose Manage NuGet Packages, search for SharpDX, and install the SharpDX.DXGI and SharpDX.MediaFoundation packages.
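If you prefer the Package Manager Console over the NuGet UI, the same two packages can be installed with commands like these (a sketch; the versions resolved may differ from the ones used when this post was written):

```
Install-Package SharpDX.DXGI
Install-Package SharpDX.MediaFoundation
```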
Step 1: Implement the MediaPlayer class
Open the downloaded VR-Player project in Visual Studio. In Solution Explorer, right-click the Content folder, choose Add -> New Item from the context menu, select Class in the dialog, name it MediaPlayer.cs, and confirm. The full contents of MediaPlayer.cs are as follows:
using SharpDX.Direct3D;
using SharpDX.Direct3D11;
using SharpDX.DXGI;
using SharpDX.MediaFoundation;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace VR_Player.Content
{
    class MediaPlayer
    {
        private MediaEngineEx mediaEngine;
        private bool ready = false;
        private Texture2D texture;
        private Surface dxgiSurface;
        public ShaderResourceView textureView;
        private SharpDX.Direct3D11.Device3 d3dDevice;

        //public static String FileName = @"C:\Data\Users\DefaultAccount\Pictures\Camera Roll\publichd.best.of.3D.05.1080p_201731221849.mp4";
        public static String FileName = Windows.ApplicationModel.Package.Current.InstalledLocation.Path
            + @"\Assets\publichd.best.of.3D.05.1080p_201731221849.mp4";

        public MediaPlayer(SharpDX.Direct3D11.Device3 device)
        {
            d3dDevice = device;
            InitMediaEngine();
        }

        public async void InitMediaEngine()
        {
            // Start up Media Foundation and share our D3D device with it.
            MediaManager.Startup();
            DeviceMultithread mt = d3dDevice.QueryInterface<DeviceMultithread>();
            mt.SetMultithreadProtected(true);
            DXGIDeviceManager deviceManager = new DXGIDeviceManager();
            deviceManager.ResetDevice(d3dDevice);

            // Ask the media engine to decode video frames as BGRA.
            MediaEngineAttributes attr = new MediaEngineAttributes();
            attr.VideoOutputFormat = (int)SharpDX.DXGI.Format.B8G8R8A8_UNorm;
            attr.DxgiManager = deviceManager;

            MediaEngineClassFactory mediaFactory = new MediaEngineClassFactory();
            var mEngine = new MediaEngine(
                mediaFactory,
                attr,
                MediaEngineCreateFlags.None,
                mediaEngine_PlaybackEvent);
            this.mediaEngine = mEngine.QueryInterface<MediaEngineEx>();

            // Open the video file and hand it to the media engine as a byte stream.
            Windows.Storage.StorageFile sampleFile = await Windows.Storage.StorageFile.GetFileFromPathAsync(FileName);
            var stream = await sampleFile.OpenAsync(Windows.Storage.FileAccessMode.Read);
            ByteStream byteStream = new ByteStream(stream);
            this.mediaEngine.SetSourceFromByteStream(byteStream, FileName);
        }

        public void mediaEngine_PlaybackEvent(MediaEngineEvent mediaEvent, long param1, int param2)
        {
            Debug.WriteLine("mediaEvent :" + mediaEvent);
            if (mediaEvent == MediaEngineEvent.CanPlay)
            {
                if (mediaEngine.Error != null)
                {
                    Debug.WriteLine("mediaEngine Error :" + mediaEngine.Error);
                    return;
                }
                if (this.mediaEngine.HasVideo())
                {
                    // Create a render-target texture matching the video's native size;
                    // decoded frames will be copied into it on every render pass.
                    int width = 0;
                    int height = 0;
                    this.mediaEngine.GetNativeVideoSize(out width, out height);
                    texture = new SharpDX.Direct3D11.Texture2D(
                        d3dDevice,
                        new SharpDX.Direct3D11.Texture2DDescription()
                        {
                            ArraySize = 1,
                            Width = width,
                            Height = height,
                            Usage = SharpDX.Direct3D11.ResourceUsage.Default,
                            Format = Format.B8G8R8A8_UNorm,
                            CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
                            BindFlags = SharpDX.Direct3D11.BindFlags.RenderTarget | SharpDX.Direct3D11.BindFlags.ShaderResource,
                            OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None,
                            SampleDescription = new SampleDescription(1, 0),
                            MipLevels = 1,
                        });
                    this.dxgiSurface = this.texture.QueryInterface<Surface>();
                    this.textureView = new ShaderResourceView(d3dDevice, texture);
                    ready = true;
                    this.mediaEngine.Play();
                }
            }
            else if (mediaEvent == MediaEngineEvent.Error)
            {
                Debug.WriteLine("mediaEngine Error :" + mediaEngine.Error.GetErrorCode());
            }
        }

        public void TransferVideoFrame()
        {
            if (!ready)
            {
                Debug.WriteLine("ready is false");
                return;
            }
            if (this.dxgiSurface == null)
            {
                Debug.WriteLine("dxgiSurface is null");
                return;
            }
            // Only copy a frame when the engine reports that a new one is available.
            long ts;
            if (!mediaEngine.OnVideoStreamTick(out ts))
            {
                Debug.WriteLine("mediaEngine.OnVideoStreamTick(out ts) is false");
                return;
            }
            if (ts < 0)
                return;
            this.mediaEngine.TransferVideoFrame(
                this.dxgiSurface,
                null,
                new SharpDX.Mathematics.Interop.RawRectangle(0, 0, this.texture.Description.Width, this.texture.Description.Height),
                null
            );
        }
    }
}
The static property public static String FileName specifies the path of the video file. Here it is assigned directly at declaration, but it can also be assigned dynamically from calling code. In this example it plays the file publichd.best.of.3D.05.1080p_201731221849.mp4 from the Assets directory.
You can also upload your own side-by-side 3D video to the HoloLens Camera Roll directory and assign the FileName property the value "C:\Data\Users\DefaultAccount\Pictures\Camera Roll\<your 3D video file name>".
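For example (the file name below is a placeholder for your own), FileName could be assigned before the MediaPlayer is constructed. Using a C# verbatim string (@"...") avoids having to escape every backslash in the path:

```csharp
// Hypothetical example: play a video uploaded to the HoloLens Camera Roll.
// "your-3d-video.mp4" stands in for your own file name.
MediaPlayer.FileName = @"C:\Data\Users\DefaultAccount\Pictures\Camera Roll\your-3d-video.mp4";
```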
public MediaPlayer(SharpDX.Direct3D11.Device3 device) is the constructor of the MediaPlayer class.
public async void InitMediaEngine() initializes the mediaEngine field and opens the video file specified by FileName, loading it into the mediaEngine.
public void mediaEngine_PlaybackEvent(MediaEngineEvent mediaEvent, long param1, int param2) is the MediaEngine event handler. Here only the CanPlay event is handled; you can go on to implement the business logic for the NotifyStableState, LoadedMetadata, Play, Pause, Ended, TimeUpdate, and Error events to make the player more complete.
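As a sketch of what that might look like (this extension is not part of the original code, and it assumes SharpDX exposes the engine's current position as the MediaEngine.CurrentTime property), the handler could switch on additional events, for example to loop playback when the video ends:

```csharp
// Hypothetical extension of mediaEngine_PlaybackEvent: handle more events.
switch (mediaEvent)
{
    case MediaEngineEvent.Play:
        Debug.WriteLine("Playback started");
        break;
    case MediaEngineEvent.Pause:
        Debug.WriteLine("Playback paused");
        break;
    case MediaEngineEvent.Ended:
        // Loop the video: seek back to the start and play again.
        mediaEngine.CurrentTime = 0;
        mediaEngine.Play();
        break;
    case MediaEngineEvent.TimeUpdate:
        // Fired periodically during playback; useful for driving a progress bar.
        break;
}
```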
public void TransferVideoFrame() is one of the key methods: it uses the mediaEngine's TransferVideoFrame method to render the current video frame onto the texture's DXGI.Surface. It is called once every time the model is rendered.
Step 2: Update the VR_Player.Content.SpinningCubeRenderer class to render the video onto the model
1. In the class's field declarations, comment out the texture declaration and add a mediaPlayer field:
//private Texture2D texture;
private MediaPlayer mediaPlayer;
2. In the CreateDeviceDependentResourcesAsync() method, comment out the ChangeImage(); call, and instead create a MediaPlayer instance and assign it to the mediaPlayer field:
// Update the image
//ChangeImage();
// Initialize the MediaPlayer
mediaPlayer = new MediaPlayer(deviceResources.D3DDevice);
3. In the Render() method, call the mediaPlayer instance's TransferVideoFrame() to copy the current video frame:
// Loading is asynchronous. Resources must be created before drawing can occur.
if (!this.loadingComplete)
{
return;
}
mediaPlayer.TransferVideoFrame();
4. In the Render() method, change context.PixelShader.SetShaderResource(0, textureView) to use the mediaPlayer's textureView property instead:
context.PixelShader.SetShaderResource(0, mediaPlayer.textureView);
That completes the side-by-side 3D video playback feature of our VR player. Connect your HoloLens, press F5 to run it, and enjoy watching movies on HoloLens!
The complete code can be downloaded from the Microsoft HoloLens China community GitHub repository.
Please credit the source when reposting; do not repost without attribution:
[Brother Xiang's HoloLens Development] Implementing a VR Player with SharpDX (C# + DirectX), Part 2: 3D Video Playback — Microsoft HoloLens China Community
http://mshololens.cn/discussion/129/xiang-ge-dai-ni-wan-HoloLens-kai-fa-ji-yu-Sharpdx-C-DirectX-shi-xian-VR-bo-fang-qi-er-3D-shi-pin-bo-fang