EXC_BAD_ACCESS when getting pinned array handle using UnityARKitPlugin
I'm trying to read a YUV camera frame with the Unity ARKit Plugin into a pair of pinned byte arrays, using SetCapturePixelData with a double-buffer. To do this, I first need to pin each array in managed memory to stop the GC from moving it, then pass the array's address into the native plugin code, which writes into it. To prevent read/write races I double-buffer: each new frame I switch the write target between two pairs of byte arrays and read from the pair that was written last.
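In isolation, the pin-then-pass pattern I'm relying on looks like this (a minimal sketch, independent of the ARKit plugin; `Marshal.WriteByte` stands in for the native writer):

```csharp
using System;
using System.Runtime.InteropServices;

class PinSketch
{
    static void Main()
    {
        byte[] buffer = new byte[4];

        // Pin the array so the GC cannot move it while native code holds its address.
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        try
        {
            // This is the call that faults on the iPad 6th gen in my case.
            IntPtr addr = handle.AddrOfPinnedObject();

            // A native plugin would write through this pointer; simulated here with Marshal.
            Marshal.WriteByte(addr, 0, 42);

            Console.WriteLine(buffer[0]); // the managed array sees the native-side write
        }
        finally
        {
            handle.Free(); // unpin exactly once; freeing twice throws InvalidOperationException
        }
    }
}
```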
On almost all compatible iOS devices, this works fine (iPhone 7, 7+, X & the iPad 9.7 2017/5th Gen); however on the iPad 2018/6th Gen, I get an EXC_BAD_ACCESS when trying to read back the address from the pinned handle, after pinning the array.
Minimally viable MonoBehaviour below:
using System;
using System.Runtime.InteropServices;
using n00dle;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.iOS;

public class ExtractVideoFrameBytes : MonoBehaviour
{
    CameraImage image;
    private byte[] uvBytes;
    private byte[] yBytes;
    private byte[] uvBytes2;
    private byte[] yBytes2;
    private int width;
    private int height;
    private GCHandle m_PinnedYArray;
    private IntPtr m_PinnedYAddr;
    private GCHandle m_PinnedUVArray;
    private IntPtr m_PinnedUVAddr;
    private UnityARSessionNativeInterface iface;
    private bool texturesInitialised = false;
    private long currentFrameNumber = 0;

    // Use this for initialization
    void Start()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += UpdateFrame;
    }

    private void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= UpdateFrame;
    }

    void UpdateFrame(UnityARCamera camera)
    {
        if (!texturesInitialised)
        {
            iface = UnityARSessionNativeInterface.GetARSessionNativeInterface();
            Debug.Log("INITIALISING");
            width = camera.videoParams.yWidth;
            height = camera.videoParams.yHeight;
            int numYBytes = camera.videoParams.yWidth * camera.videoParams.yHeight;
            int numUVBytes = camera.videoParams.yWidth * camera.videoParams.yHeight / 2; // quarter resolution, but two bytes per pixel
            yBytes = new byte[numYBytes];
            uvBytes = new byte[numUVBytes];
            yBytes2 = new byte[numYBytes];
            uvBytes2 = new byte[numUVBytes];
            m_PinnedYArray = GCHandle.Alloc(yBytes, GCHandleType.Pinned);
            m_PinnedUVArray = GCHandle.Alloc(uvBytes, GCHandleType.Pinned);
            texturesInitialised = true;
        }

        if (TryGetImage(ref image))
        {
            Debug.Log("Got an image...");
        }
        else
        {
            Debug.LogError("No image :(");
        }
    }

    public bool TryGetImage(ref CameraImage cameraImage)
    {
#if !UNITY_EDITOR && UNITY_IOS
        ARTextureHandles handles = iface.GetARVideoTextureHandles();
        if (handles.TextureY == IntPtr.Zero || handles.TextureCbCr == IntPtr.Zero)
            return false;
        if (!texturesInitialised)
            return false;

        long doubleBuffId = currentFrameNumber % 2;
        ++currentFrameNumber;

        m_PinnedYArray.Free();
        m_PinnedYArray = GCHandle.Alloc(doubleBuffId == 0 ? yBytes : yBytes2, GCHandleType.Pinned);
        m_PinnedYAddr = m_PinnedYArray.AddrOfPinnedObject();

        m_PinnedUVArray.Free();
        m_PinnedUVArray = GCHandle.Alloc(doubleBuffId == 0 ? uvBytes : uvBytes2, GCHandleType.Pinned);
        m_PinnedUVAddr = m_PinnedUVArray.AddrOfPinnedObject();

        // Tell Unity to write the NEXT frame into these buffers
        iface.SetCapturePixelData(true, m_PinnedYAddr, m_PinnedUVAddr);

        // Now, read off the other buffers
        cameraImage.y = (doubleBuffId == 0 ? yBytes2 : yBytes);
        cameraImage.uv = (doubleBuffId == 0 ? uvBytes2 : uvBytes);
        cameraImage.width = width;
        cameraImage.height = height;
#endif
        return true;
    }
}
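For reference, the buffer sizing and the double-buffer parity used above can be checked in isolation. This is a sketch with a hypothetical 1280x720 frame (the real dimensions come from `camera.videoParams` at runtime):

```csharp
using System;

class BufferMath
{
    static void Main()
    {
        int yWidth = 1280, yHeight = 720; // hypothetical frame size

        // Y plane: one byte per pixel. UV plane: half width, half height
        // (quarter resolution), but two bytes per pixel, so w * h / 2 total.
        int numYBytes = yWidth * yHeight;
        int numUVBytes = yWidth * yHeight / 2;
        Console.WriteLine(numYBytes);  // 921600
        Console.WriteLine(numUVBytes); // 460800

        // Double-buffer parity: frame N pins and writes pair N % 2,
        // and hands the other pair back to the caller for reading.
        for (long frame = 0; frame < 4; ++frame)
        {
            long writeId = frame % 2;
            long readId = 1 - writeId;
            Console.WriteLine($"frame {frame}: write pair {writeId}, read pair {readId}");
        }
    }
}
```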
I don't think there's anything particularly unusual in the code above, but I'd be keen to know if anyone can spot something I've done wrong. Indeed, this is a stripped-down version of the code used in Unity's experimental AR Interface to retrieve the camera frame.
Since this only occurs on a single device, I suspect it's a bug in the underlying native code (probably an extra free somewhere), so I've also logged this as an issue on the UnityARKitPlugin issue tracker; if I get a response, I'll update or delete this question accordingly.
Edit: The stack trace I see in Xcode is given below:
c# unity3d augmented-reality arkit
asked Nov 26 '18 at 10:12, edited Nov 26 '18 at 10:33 by n00dle