CV-AR: Support
Posted: 11 November 2018 06:19 AM   [ # 46 ]
Total Posts:  54
Joined  2009-02-16

Hi everyone,

Is there anyone here that would be interested in being a beta tester for the next release of CV-AR? If you are interested then please PM me.

Thanks,
Kent

Posted: 11 November 2018 05:57 PM   [ # 47 ]
Total Posts:  9
Joined  2018-11-08

Hello there
Does the plugin work with R20.030 on Windows 10 Pro 1809 and the current iOS 12.1?
I can connect the iPhone and see a grey box for the movie, but nothing changes in this state for minutes…

Greez, Mike C.

Posted: 12 November 2018 11:45 AM   [ # 48 ]
Total Posts:  54
Joined  2009-02-16

Hi Mike,

I can’t verify with Windows 10 Pro 1809, since I won’t be updating until I know it is stable. It seems Microsoft has pulled that update, and many users have reported problems with it.

As for R20.030 with Windows 10 17134 and iOS 12.1: yes, it should work without any issues. I will run another test later today with a fresh install on the iPhone and R20.030 to verify.

Best regards,
Kent

Posted: 12 November 2018 02:17 PM   [ # 49 ]
Total Posts:  9
Joined  2018-11-08

Hi Kent,

Thanks for moving the thread around and for your quick answers. I like the idea of using the iPhone for face tracking. But shouldn’t/couldn’t there be a way for customers to make recordings wherever they like and then send those recordings to the animators, who can import the tracking data into their scenes?

Greez, Mike C.

Posted: 13 November 2018 02:55 AM   [ # 50 ]
Total Posts:  54
Joined  2009-02-16
MikeC. - 12 November 2018 02:17 PM

But shouldn’t/couldn’t there be a way for customers to make recordings wherever they like and then send those recordings to the animators, who can import the tracking data into their scenes?

Hi Mike,

I am working on that particular issue right now. More info to come later this year. If you wish to be a beta tester then just send me a PM.

Best regards,
Kent

Posted: 15 November 2018 12:28 AM   [ # 51 ]
Total Posts:  1
Joined  2016-05-26

I know this thread is for reporting bugs, but I would like to share my results so far from a recent test:

https://www.youtube.com/watch?v=Kxb66xr-Bxg

Looking forward to beta testing in the near future. :)

Captured on an iPhone XS using C4D R20. Thanks, developers, for your hard work; this is so desperately needed in C4D. :)

Posted: 26 November 2018 09:09 AM   [ # 52 ]
Total Posts:  1
Joined  2017-10-02

System: macOS 10.14.1 Mojave, R18
Tried to install with the Toolbox; searching for “CV-AR” doesn’t return any results.
Attempted to install manually by placing the downloaded “cv-ar” package in the Plugins folder (as defined in Preferences) and restarting C4D (several times).
Neither install method works. I don’t see CV-AR in the C4D > Plugins menu.

Please advise.

Thanks—Steven

Posted: 26 November 2018 10:17 AM   [ # 53 ]
Total Posts:  54
Joined  2009-02-16

Hi Steven,

CV-AR requires R19 or higher.

Kent

Posted: 06 December 2018 07:33 AM   [ # 54 ]
Total Posts:  1
Joined  2017-12-26

Sorry if this question comes through multiple times; Safari doesn’t work with these forums for some reason. Anyway…
What’s the best way to convert CV-AR recordings to 24p so I can see what my work will look like prior to export? Right now, if I change it in the project settings, the audio will adapt to the new rate but the video will not, meaning the face mesh recording plays back 20% slower.

I can change it in the render settings, but that may result in some things not working right (like eye blinks).

Thanks!
John

Posted: 06 December 2018 07:50 AM   [ # 55 ]
Total Posts:  54
Joined  2009-02-16

Hi John,

I suggest using the Point Cache Tag.

- Right-click the Face Mesh and add Character Tags->Point Cache.
- Press Store State.
- Check the Enable and PSR checkboxes.
- Press Calculate.
- Then do the same steps for each eye (but the eyes will need to be made into polygon objects first).
- Now you can move the Face Mesh out from the Capture Object and delete the Capture Object.
- One issue now is that the Point Cache doesn’t cache changing UVs, so the texture will not update correctly; you will need to lock your UVs to a single texture. On the Capture Object, check the “Use Single Frame Texture” option and choose a frame for “Specific Texture Frame”. Then go to the data for your capture by right-clicking it in the CV-AR dialog and choosing Show Files, and find the texture for the frame you just chose (it will have that number at the end: _XXXX.jpg). Drag this texture onto your material to replace the Face Shader.
- You should now be able to change the project to any FPS you require via Attribute Manager->Mode->Project->FPS; in your case, 24. This will also need to be set in the Render Settings, as you have already done.
- Then, back on your Point Cache, set the Scale to 30/24 * 100, i.e. 125%. This speeds the animation up again to match your lower FPS.
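The 30/24 * 100 arithmetic in that last step generalizes to any frame-rate change. A minimal Python sketch of it (the function name is mine, not part of CV-AR or the C4D SDK):

```python
def point_cache_scale(source_fps, target_fps):
    """Point Cache Scale (%) needed after changing the project frame rate.

    The cache was recorded at source_fps; a target_fps project plays it
    back target_fps/source_fps as fast, so the Scale compensates with
    source_fps/target_fps * 100.
    """
    return source_fps / target_fps * 100.0

print(point_cache_scale(30, 24))  # 30/24 * 100 = 125.0
```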

Best,
Kent

Posted: 06 December 2018 08:02 AM   [ # 56 ]
Total Posts:  54
Joined  2009-02-16

Hi again John,

Actually I think I gave you the wrong advice there.

I was thinking of a different situation.

Let me have a look into this a bit more and see if I can find the correct solution for you.

Best,
Kent

Posted: 06 December 2018 08:09 AM   [ # 57 ]
Total Posts:  54
Joined  2009-02-16

Hi John,

I updated my previous post. It was correct after all; I just needed to add one last step, which was to set the Scale of the Point Cache so that it speeds up the animation correctly.

Let me know how you get on.

Best,
Kent

Posted: 11 December 2018 07:12 AM   [ # 58 ]
Total Posts:  1
Joined  2018-11-30

Hi, this is great and almost everything is working well on my iPhone XS. I do have one issue: am I misunderstanding the purpose of Mouth Close? I would assume it is the opposite of Mouth Open. My face capture imports correctly, but if I attach a Result node in Xpresso to ‘Mouth Close’, I can’t find any rhyme or reason to its values, and neither can my rigged model that’s driven from the capture via Xpresso. In some instances it seems right; in others (say, when my mouth is open) the data is way off.

But the imported face model is correct. I’m using ‘Mouth Close’ as my primary mouth animator; is there a better way?

Posted: 11 December 2018 07:25 AM   [ # 59 ]
Total Posts:  54
Joined  2009-02-16

Hi briwil,

Mouth Close represents the closing of the lips, independent of the jaw position.

https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation/2928266-mouthclose

You can click on any of the blend shapes on the official ARKit documentation to see what they represent.

https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation

You may want to try using Jaw Open instead of Mouth Close in your Xpresso setup.
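To make the distinction concrete, here is a hedged Python sketch. The weight names match the ARKit blend-shape keys, but the combination formula is purely an illustrative assumption of mine, not something CV-AR or ARKit does:

```python
# mouthClose measures lip closure *independent of* the jaw, so it can be
# high even while the jaw (and therefore the mouth) is open. A plausible
# heuristic for "how open is the mouth" subtracts it from jawOpen.
def mouth_open_amount(weights):
    """Illustrative mouth-opening estimate from ARKit blend-shape weights.

    weights: dict of blend-shape name -> coefficient in 0..1.
    Result is clamped to the 0..1 range.
    """
    jaw = weights.get("jawOpen", 0.0)
    close = weights.get("mouthClose", 0.0)
    return max(0.0, min(1.0, jaw - close))

# Jaw dropped, lips mostly apart: mouth clearly open.
print(round(mouth_open_amount({"jawOpen": 0.8, "mouthClose": 0.1}), 2))  # 0.7
```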


Best,
Kent

Posted: 02 January 2019 11:39 PM   [ # 60 ]
Total Posts:  2
Joined  2017-08-01

Hi, great work on this plugin, Kent!

I would like to use this plugin on someone’s face when that person cannot be present to connect to the computer.
Is there a way they can use the iPhone app and export the recording to me to import?
I can’t seem to find an option to import data.

Many thanks for any help!

Chris

P.S. Happy New Year!
