Window/screen snapshots work well for tooling scenarios such as screenshots and annotation, but what about continuously acquiring a window/screen image stream? This post looks at how scenarios like desktop sharing in video conferencing and remote desktop implement screen recording.
The common approaches for real-time screen capture are GDI, WGC, and DXGI.
GDI
GDI (Graphics Device Interface) is driven through the Windows API exposed by user32. It is the oldest and most basic graphics device interface in Windows and works on every Windows version. For screen/window snapshots, see: .NET 窗口/屏幕截圖 - 唐宋元明清2188 - 博客園 (cnblogs.com)
To record the screen, you can build on the GDI snapshot approach and grab screen data on a timer.
GDI performance is not great, especially for high frame rates and high resolutions: above roughly 20 captures per second, CPU usage starts to climb. GDI also cannot capture the mouse cursor, so the cursor has to be drawn onto the captured image afterwards.
That said, GDI is easy to use and does not depend on the GPU, so for screenshot scenarios without demanding performance requirements it is the recommended option.
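As a rough illustration, here is a minimal sketch of timer-driven GDI capture with System.Drawing (the class and member names are illustrative and not taken from the demo repo; the cursor would still have to be composited separately, e.g. via GetCursorInfo/DrawIconEx):
using System;
using System.Drawing;

public sealed class GdiRecorder : IDisposable
{
    private readonly System.Threading.Timer _timer;
    private readonly int _width;
    private readonly int _height;
    public event Action<Bitmap> FrameArrived;

    public GdiRecorder(int width, int height, int fps)
    {
        _width = width;
        _height = height;
        // Fire roughly fps times per second; GDI cost grows quickly above ~20 fps.
        _timer = new System.Threading.Timer(_ => CaptureOnce(), null, 0, 1000 / fps);
    }

    private void CaptureOnce()
    {
        // BitBlt the desktop into a bitmap; the consumer is responsible for disposing it.
        var bitmap = new Bitmap(_width, _height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
        using (var g = Graphics.FromImage(bitmap))
        {
            g.CopyFromScreen(0, 0, 0, 0, new Size(_width, _height));
        }
        FrameArrived?.Invoke(bitmap);
    }

    public void Dispose() => _timer.Dispose();
}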
WGC
Windows Graphics Capture is a newer mechanism introduced in Windows 10 for capturing screen and window content: Screen capture - UWP applications | Microsoft Learn
Access is exposed through WinRT; add <UseWinRT>true</UseWinRT> to the csproj properties.
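For reference, one possible placement in the project file (the target framework shown here is only an assumption; use whatever your project actually targets):
<PropertyGroup>
  <TargetFramework>net6.0-windows10.0.19041.0</TargetFramework>
  <UseWinRT>true</UseWinRT>
</PropertyGroup>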
Example capture implementation:
public WgcCapture(IntPtr hWnd, CaptureType captureType)
{
    if (!GraphicsCaptureSession.IsSupported())
    {
        throw new Exception("Windows Graphics Capture API is not supported");
    }
    var item = captureType == CaptureType.Screen
        ? CaptureUtils.CreateItemForMonitor(hWnd)
        : CaptureUtils.CreateItemForWindow(hWnd);
    CaptureSize = new Size(item.Size.Width, item.Size.Height);
    var d3dDevice = Direct3D11Utils.CreateDevice(false);
    _device = Direct3D11Utils.CreateSharpDxDevice(d3dDevice);
    _framePool = Direct3D11CaptureFramePool.CreateFreeThreaded(d3dDevice,
        pixelFormat: DirectXPixelFormat.B8G8R8A8UIntNormalized, numberOfBuffers: 1, item.Size);
    _desktopImageTexture = CreateTexture2D(_device, item.Size);
    _framePool.FrameArrived += OnFrameArrived;
    item.Closed += (i, _) =>
    {
        _framePool.FrameArrived -= OnFrameArrived;
        StopCapture();
        ItemClosed?.Invoke(this, i);
    };
    _session = _framePool.CreateCaptureSession(item);
}
private void OnFrameArrived(Direct3D11CaptureFramePool sender, object args)
{
    try
    {
        using var frame = _framePool.TryGetNextFrame();
        if (frame == null) return;
        var data = CopyFrameToBytes(frame);
        var captureFrame = new CaptureFrame(CaptureSize, data);
        FrameArrived?.Invoke(this, captureFrame);
    }
    catch (Exception)
    {
        // ignored
    }
}
The WGC capture flow:
Create the capture item: use CreateCaptureItemForMonitor or CreateCaptureItemForWindow.
Create the D3D11 device and context: call D3D11CreateDevice to create the Direct3D 11 device and device context. Although DXGI is not used for the capture itself, the DXGI device types are still referenced here.
Convert to a Direct3D device: wrap the D3D11 device as a SharpDX Direct3D device object.
Create the frame pool and session: use Direct3D11CaptureFramePool and GraphicsCaptureSession.
Start capturing: call StartCapture to begin the session and register the frame-arrived event.
Process frames: handle the captured frames in the frame-arrived event.
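The StartCapture/StopCapture methods referenced above are not shown in the post; a minimal sketch, assuming the _session and _framePool fields from the constructor, could be:
public void StartCapture()
{
    // Frames start flowing into the frame pool; OnFrameArrived fires for each one.
    _session.StartCapture();
}

public void StopCapture()
{
    // Dispose the session and the frame pool to stop capturing and free GPU resources.
    _session?.Dispose();
    _framePool?.Dispose();
}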
Here the mature SharpDX library is used for the Direct3D work, referencing the following NuGet packages:
<PackageReference Include="SharpDX" Version="4.2.0" />
<PackageReference Include="SharpDX.Direct3D11" Version="4.2.0" />
<PackageReference Include="SharpDX.DXGI" Version="4.2.0" />
With the captured Direct3D frame object in hand, convert the frame image to a byte stream:
private byte[] CopyFrameToBytes(Direct3D11CaptureFrame frame)
{
    using var bitmap = Direct3D11Utils.CreateSharpDxTexture2D(frame.Surface);
    _device.ImmediateContext.CopyResource(bitmap, _desktopImageTexture);
    // Map the Texture2D resource into CPU memory
    var mappedResource = _device.ImmediateContext.MapSubresource(_desktopImageTexture, 0, MapMode.Read, MapFlags.None);
    // Bgra32
    var bytesPerPixel = 4;
    var width = _desktopImageTexture.Description.Width;
    var height = _desktopImageTexture.Description.Height;
    using var inputRgbaMat = new Mat(height, width, MatType.CV_8UC4, mappedResource.DataPointer, mappedResource.RowPitch);
    var data = new byte[CaptureSize.Width * CaptureSize.Height * bytesPerPixel];
    if (CaptureSize.Width != width || CaptureSize.Height != height)
    {
        var size = new OpenCvSharp.Size(CaptureSize.Width, CaptureSize.Height);
        Cv2.Resize(inputRgbaMat, inputRgbaMat, size, interpolation: InterpolationFlags.Linear);
    }
    var sourceSize = new Size(frame.ContentSize.Width, frame.ContentSize.Height);
    if (CaptureSize == sourceSize)
    {
        // Same size: copy row by row, honoring the mapped resource's row pitch
        var rowPitch = mappedResource.RowPitch;
        for (var y = 0; y < height; y++)
        {
            var srcRow = inputRgbaMat.Data + y * rowPitch;
            var destRowOffset = y * width * bytesPerPixel;
            Marshal.Copy(srcRow, data, destRowOffset, width * bytesPerPixel);
        }
    }
    else
    {
        // Resized: the Mat buffer is now contiguous, so copy it in one go
        Marshal.Copy(inputRgbaMat.Data, data, 0, data.Length);
    }
    _device.ImmediateContext.UnmapSubresource(_desktopImageTexture, 0);
    return data;
}
The Surface object is converted to a SharpDX Texture2D, mapped to the CPU, and the image bytes are copied out of memory.
The code above outputs 4-channel 8-bit Bgra32 by default; if you want 3-channel Bgr24 instead, convert before copying from memory:
using var inputRgbMat = new Mat();
Cv2.CvtColor(inputRgbaMat, inputRgbMat, ColorConversionCodes.BGRA2BGR);
// The destination buffer must then be Width * Height * 3 bytes
Marshal.Copy(inputRgbMat.Data, data, 0, CaptureSize.Width * CaptureSize.Height * 3);
With the byte data in hand, you can save it locally or display it in the UI.
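For example, saving a frame as a PNG could look like this (a minimal sketch; SaveFrame is an illustrative helper name, and the Size/Data members follow the demo's CaptureFrame usage):
private static void SaveFrame(CaptureFrame frame, string path)
{
    var stride = frame.Size.Width * 4; // Bgra32: 4 bytes per pixel
    var bitmap = BitmapSource.Create(frame.Size.Width, frame.Size.Height, 96, 96,
        PixelFormats.Bgra32, null, frame.Data, stride);
    var encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bitmap));
    using var stream = System.IO.File.Create(path);
    encoder.Save(stream);
}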
Screen capture demo display:
private void CaptureButton_OnClick(object sender, RoutedEventArgs e)
{
    var monitorHandle = MonitorUtils.GetMonitors().First().MonitorHandle;
    var wgcCapture = new WgcCapture(monitorHandle, CaptureType.Screen);
    wgcCapture.FrameArrived += WgcCapture_FrameArrived;
    wgcCapture.StartCapture();
}
private void WgcCapture_FrameArrived(object? sender, CaptureFrame e)
{
    Application.Current.Dispatcher.Invoke(() =>
    {
        var stride = e.Size.Width * 4; // 4 bytes per pixel in BGRA format
        var bitmap = BitmapSource.Create(e.Size.Width, e.Size.Height, 96, 96, PixelFormats.Bgra32, null, e.Data, stride);
        bitmap.Freeze();
        CaptureImage.Source = bitmap;
    });
}
WGC takes advantage of modern graphics hardware and OS features, delivering high-performance, low-latency screen capture. It suits latency-sensitive scenarios such as screen recording and video conferencing.
For more, see the official docs: Screen capture to video - UWP applications | Microsoft Learn. You can also browse and run my demo: kybs00/CaptureImageDemo (github.com)
DXGI
DXGI is short for DirectX Graphics Infrastructure. Starting with Windows 8, Microsoft introduced the Desktop Duplication API; because it delivers desktop images through DXGI, it is very fast.
DXGI uses the GPU, so CPU usage is very low and performance is high. Official DXGI docs: DXGI - Win32 apps | Microsoft Learn
Since DXGI is also built on DirectX, many of the calls resemble WGC: work through D3D with the various QueryInterface and Enum calls, with AcquireNextFrame as the core method.
Its drawback is that it cannot capture individual windows, so window sharing in a video conference cannot be implemented with DXGI.
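The duplication setup itself is not shown in the post; a minimal SharpDX sketch, assuming the primary adapter and primary output, might look like this (the _mDevice/_mDeskDupl names mirror the capture method below):
private void InitializeDuplication()
{
    using var factory = new SharpDX.DXGI.Factory1();
    using var adapter = factory.GetAdapter1(0);                         // first GPU
    _mDevice = new SharpDX.Direct3D11.Device(adapter);                  // D3D11 device on that adapter
    using var output = adapter.GetOutput(0);                            // first monitor
    using var output1 = output.QueryInterface<SharpDX.DXGI.Output1>();
    _mDeskDupl = output1.DuplicateOutput(_mDevice);                     // Desktop Duplication interface
}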
Let's look at the demo calling code:
private void CaptureButton_OnClick(object sender, RoutedEventArgs e)
{
    var monitorDxgiCapture = new MonitorDxgiCapture();
    monitorDxgiCapture.FrameArrived += WgcCapture_FrameArrived;
    monitorDxgiCapture.StartCapture();
}
private void WgcCapture_FrameArrived(object? sender, CaptureFrame e)
{
    Application.Current?.Dispatcher.Invoke(() =>
    {
        var stride = e.Size.Width * 4; // 4 bytes per pixel in BGRA format
        var bitmap = BitmapSource.Create(e.Size.Width, e.Size.Height, 96, 96, PixelFormats.Bgra32, null, e.Data, stride);
        bitmap.Freeze();
        CaptureImage.Source = bitmap;
    });
}
Capturing the frame data:
[HandleProcessCorruptedStateExceptions]
private CaptureFrame CaptureFrame()
{
    try
    {
        var data = new byte[CaptureSize.Width * CaptureSize.Height * 4];
        var result = _mDeskDupl.TryAcquireNextFrame(TimeOut, out _, out var desktopResource);
        if (result.Failure) return null;
        using var tempTexture = desktopResource?.QueryInterface<Texture2D>();
        // Copy the image texture: GPU hardware-accelerated texture copy
        _mDevice.ImmediateContext.CopyResource(tempTexture, _desktopImageTexture);
        desktopResource?.Dispose();
        var desktopSource = _mDevice.ImmediateContext.MapSubresource(_desktopImageTexture, 0, MapMode.Read, MapFlags.None);
        using var inputRgbaMat = new Mat(_screenSize.Height, _screenSize.Width, MatType.CV_8UC4, desktopSource.DataPointer);
        if (CaptureSize.Width != _screenSize.Width || CaptureSize.Height != _screenSize.Height)
        {
            var size = new OpenCvSharp.Size(CaptureSize.Width, CaptureSize.Height);
            Cv2.Resize(inputRgbaMat, inputRgbaMat, size, interpolation: InterpolationFlags.Linear);
        }
        Marshal.Copy(inputRgbaMat.Data, data, 0, data.Length);
        var captureFrame = new CaptureFrame(CaptureSize, data);
        _mDevice.ImmediateContext.UnmapSubresource(_desktopImageTexture, 0);
        // Release the frame
        _mDeskDupl.ReleaseFrame();
        return captureFrame;
    }
    catch (AccessViolationException)
    {
        return null;
    }
    catch (Exception)
    {
        return null;
    }
}
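How CaptureFrame is driven is not shown in the post; one plausible approach is a background loop, sketched below (the _cts field and Task usage are illustrative assumptions):
public void StartCapture()
{
    _cts = new CancellationTokenSource();
    Task.Run(() =>
    {
        while (!_cts.IsCancellationRequested)
        {
            var frame = CaptureFrame();               // returns null on timeout or failure
            if (frame != null) FrameArrived?.Invoke(this, frame);
        }
    });
}

public void StopCapture() => _cts?.Cancel();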
Here too, the 2D texture resource is copied with hardware acceleration and then output as byte data via a memory copy.
For 1080p local screen recording and display, CPU and GPU usage is as follows:
At 1080p there is no obvious difference from the WGC approach, and latency is similar. At 4K and 8K, however, DXGI is the better option: it manages the graphics hardware directly and provides high-performance rendering, talking to the kernel-mode driver and the system hardware. The architecture diagram from the official docs shows this:
So in 4K scenarios that demand extremely low latency and high frame rates, DXGI provides the necessary performance.
Full demo code for all three approaches is in the GitHub repo: kybs00/CaptureImageDemo (github.com)
To summarize the three approaches:
GDI: works on all Windows versions, but performance is relatively low.
WGC: Windows 10 1803 and above; high performance and low latency; supports both screen and window capture.
DXGI: Windows 8 and above; suited to high-resolution, high-frame-rate, performance-critical needs; screen capture only, no window capture.
Recording covers scenarios such as screen recording, live streaming, remote desktop, video conferencing, and screen casting. For recording screens/windows, prefer WGC and fall back to DXGI for Windows 8 compatibility; if only the screen needs to be recorded and the scenario is high-resolution/high-frame-rate, prefer DXGI.
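As a rough illustration of that recommendation (CreateScreenCapture is a hypothetical helper; WgcCapture and MonitorDxgiCapture are the demo classes shown above):
public static object CreateScreenCapture(IntPtr monitorHandle)
{
    if (GraphicsCaptureSession.IsSupported())                   // WGC: Windows 10 1803+
        return new WgcCapture(monitorHandle, CaptureType.Screen);
    if (Environment.OSVersion.Version >= new Version(6, 2))     // DXGI: Windows 8+
        return new MonitorDxgiCapture();
    throw new PlatformNotSupportedException("Fall back to a GDI-based capturer here.");
}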