Since we announced the bundle between Transcriptive and PowerSearch a few months back, our team has been working even harder to improve the plugin so users can make the most of having transcripts and search engine capabilities inside Premiere Pro. So we are releasing Transcriptive 2.0.5, which fixes some critical reported bugs, and PowerSearch 2.0: a much faster and more efficient version of our metadata search tool.
Having accurate transcripts available in Premiere is already a big help in speeding up video production workflows, especially while working remotely. (See this previous post about Transcriptive’s sharing capabilities for remote collaboration!) But we truly believe, and have been hearing this from clients as well, that having all the content in your video editing project, especially transcripts, converted into searchable metadata makes it much easier to find content when you have large amounts of footage, markers, sequences, and media files. And this is why the PowerSearch and Transcriptive combo makes it much easier to find soundbites, different takes of a script, or pinpoint any time a name or place is mentioned.
PowerSearch 1.0 was decently fast but could be slow on larger projects. Our next release makes use of a powerful SQL database to make PowerSearch an order of magnitude faster. The key to PowerSearch is that it indexes an entire Premiere Pro project, much like Google indexes websites, to optimize search performance. An index of hundreds of videos that used to take 10-12 hours to create is now indexed in less than an hour and the same database makes searching all that data significantly faster. Another advantage is the ability to use common search symbols, such as minus signs and quotes, for more precise, accurate searching. For editors with hundreds of hours of video, this can help narrow down searches from hundreds of results to a few dozen.
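We haven't published PowerSearch's internal schema, but the general idea of why a SQL full-text index makes this kind of search so fast can be sketched with SQLite's FTS5 engine. Everything below is illustrative, not PowerSearch's actual code: the table layout, clip names, and timecodes are made up, and FTS5 spells term exclusion as `NOT` where a search box would use a minus sign.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Index the transcript text; keep the timecode as unindexed payload.
con.execute("CREATE VIRTUAL TABLE idx USING fts5(clip, timecode UNINDEXED, text)")
con.executemany("INSERT INTO idx VALUES (?, ?, ?)", [
    ("interview_01.mov", "00:01:12:00", "we moved out west in the sixties"),
    ("interview_02.mov", "00:04:03:10", "the Bakersfield sound changed country music"),
])
# Double quotes match an exact phrase; NOT excludes a term.
hits = con.execute(
    "SELECT clip, timecode FROM idx WHERE idx MATCH ?",
    ('"bakersfield sound" NOT west',),
).fetchall()
print(hits)  # → [('interview_02.mov', '00:04:03:10')]
```

Because the text is tokenized once at indexing time, each query is a lookup in the index rather than a scan of every clip, which is where the order-of-magnitude speedup comes from.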
PowerSearch still returns search results like any search engine, showing you the search term, the words around it, the clip/sequence/marker it’s in, and the timecode. Clicking on a result will open the clip or sequence and jump straight to the correct timecode in the Source or Program panel.
PowerSearch 2.0 can still be purchased separately and will help your production even if you are getting transcripts from a different source or just want to search markers. However, it is now bundled with Transcriptive: you can get both for $149, while PowerSearch costs $99 on its own. So if you haven’t tried using PowerSearch and Transcriptive together, give it a try! We are constantly working on Transcriptive to add more capabilities, reduce transcription costs, and improve the sharing options now available in the panel. Features like Clip Mode and the new Text Editor go beyond just transcribing media and sequences, and combining them with a much faster PowerSearch makes finding content quicker than ever.
Transcriptive 2.0 users can use their Transcriptive license to activate PowerSearch. Trial licenses for both Transcriptive and PowerSearch are available here and our team would be happy to help if you need support figuring out a workflow for you and your team. Send any questions, concerns, or feedback to firstname.lastname@example.org! We would love to hear from you.
A lot of you have a ton of footage that you want to transcribe. One of our goals with Transcriptive has been to enable you to transcribe everything that goes into your Premiere project. To search it, to create captions, to easily see what talent is saying, etc. But if you’ve got 100 hours of footage, even at $0.12/min the costs can add up. So…
Transcriptive has a new feature that will help you cut your transcribing costs by 50%. The latest version of our Premiere Pro transcription plugin already cut the cost of transcribing from $0.12 to $0.08 per minute. However, our new prepaid minutes packages go even further, allowing users to purchase transcription credits in bulk! You can save 50% per minute, transcribing for $2.40/hr, or $0.04/min. This applies to both Transcriptive AI and Speechmatics.
The pre-paid minutes option reduces transcription costs to $0.04/min, with packages available for $150 or $500. For small companies and independent editors, the $150 package makes it possible to secure 62.5 hours of transcription without breaking the bank. If you and your team are transcribing large amounts of footage, the $500 package will let you save even more.
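For the arithmetic-minded, the savings are easy to check; the rates below are the ones quoted in this post:

```python
# Rates quoted in this post, in dollars per minute.
PAYG_RATE = 0.08     # pay-as-you-go
PREPAID_RATE = 0.04  # pre-paid packages

def hours_of_transcription(dollars, rate):
    """How many hours of transcription a given spend buys at a given rate."""
    return dollars / rate / 60

print(round(hours_of_transcription(150, PREPAID_RATE), 2))  # → 62.5
print(round(hours_of_transcription(500, PREPAID_RATE), 2))  # → 208.33
# The same $150 at pay-as-you-go rates buys only half as much:
print(round(hours_of_transcription(150, PAYG_RATE), 2))     # → 31.25
```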
The credits are good for 24 months, so you don’t need to worry about them expiring.
You don’t HAVE to pre-pay. You can still Pay-As-You-Go for $0.08/min. That’s still really inexpensive for transcription and if you’re happy with that, we’re happy with it too.
However, if you’re transcribing a lot of footage, pre-paying is a great way of getting costs down. It also has other benefits, you don’t need to share your credit card with co-workers and other team members. For bigger companies, production managers, directors or even an account department can be in charge of purchasing the minutes and feeding credits into the Premiere Pro Transcriptive panel so editors no longer have to worry about the charges submitted to the account holder’s credit card.
Buying the minutes in advance is simple! Go to your Premiere Pro panel, click on your profile icon, choose “Pre-Pay Minutes”, and select the option that best suits your needs. You can also pre-pay credits from your web app account by logging into app.transcriptive.com, opening your “Dashboard”, and clicking on “Buy Minutes”. A pop-up window will ask you to choose a pre-paid minutes package and enter your credit card information. Confirm the purchase and your prepaid minutes will show under “Balance” on your homepage. The prepaid minutes balance will also be visible in your Premiere Pro panel, right next to the cost of the transcription.
Applying purchased credits to your transcription jobs is also a quick and easy process. While submitting a clip or sequence for transcription, Transcriptive will automatically deduct the amount required to transcribe the job from your balance. If the available credit is not enough to transcribe your job, the remaining minutes will be charged to the credit card on file.
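The deduction behavior described above can be sketched like this. To be clear, this is our illustration, not Transcriptive's actual billing code, and the rate applied to overflow minutes is assumed to be the normal pay-as-you-go rate:

```python
def charge_job(job_minutes, balance_minutes, payg_rate=0.08):
    """Apply prepaid minutes to a transcription job; bill any shortfall.

    Illustrative sketch: overflow minutes are assumed to be billed
    to the card at the pay-as-you-go rate.
    """
    covered = min(job_minutes, balance_minutes)
    overflow = job_minutes - covered
    return balance_minutes - covered, round(overflow * payg_rate, 2)

# A 90-minute job against a 60-minute balance: the balance is drained
# and the remaining 30 minutes go on the card.
print(charge_job(90, 60))  # → (0, 2.4)
```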
The 50% discount on prepaid minutes only applies to transcribing, but minutes can also be used to Align existing transcripts at the regular cost. English transcripts can be imported into Transcriptive and aligned to your clips or sequences for free, while text in other languages will align for $0.02/min with Transcriptive AI and $0.04/min with Transcriptive Speechmatics.
Have you ever considered using Transcriptive to build an effective Search Engine Optimization (SEO) strategy and increase the reach of your Social Media videos? Having your footage transcribed right after the shooting can help you quickly scan everything for soundbites that will work for instant social media posts. You can find the terms your audience searches for the most, identify high ranked keywords in your footage, and shape the content of your video based on your audience’s behavior.
According to vlogger and Social Media influencer Jack Blake, being aware of what your audience is doing online is a powerful tool to choose when and where to post your content, but also to decide what exactly to include in your Social Media Videos, which tend to be short and soundbite-like. The content of your media, titles, video descriptions and thumbnails, tags and post mentions should all be part of a strategy built around what your audience is searching for. And this is why Blake is using Transcriptive not only to save time on editing but also to carefully curate his video content and attract new viewers.
Right after shooting his videos, the vlogger transcribes everything and exports the transcripts as rich text so he can quickly share the content with his team. After that, a copywriter scans through the transcribed audio and identifies content that will bring traffic to the client’s website and increase ROI. “It’s amazing. I transcribe the audio in minutes, edit some small mistakes without having to leave Premiere Pro, and share the content with my team. After that, we can compare the content with our targeted keywords and choose what I should cut. The editing goes quickly and smoothly because the words are already time-stamped and my captions take no time to create. I just export the transcripts as an SRT and it is pretty much done,” explains Blake.
Of course, it all starts with targeting the right keywords and that can be tricky, but there are many analytics and measurement applications offering this service nowadays. If you are just getting started in the whole keyword targeting game, the easiest and most accessible way is connecting your In-site Search queries with Google Analytics. This will allow you to get information on how users are interacting with your website, including how much your audience searches, who is performing searches and who is not, where they begin searching, as well as where they head afterward. Google Analytics will also allow you to find out what exactly people are typing into Google when searching for content on the web.
For Blake, using competitors’ hashtags from Youtube has been very helpful to increase video views. “One of the differentials in my work is that I research my client’s competitors on Youtube and identify the VidIQs (Youtube keyword tags) they have been using on their videos so we can use competitive tagging in our content description and video title. This allows the content I produced for the client to show when people search for this specific hashtag on Youtube,” he explains. Blake’s team is also using Google Trends, a website that analyzes the popularity of top search queries in Google Search across various regions and languages. It’s a great tool to find out how often a search term is entered in Google’s search engine, compare it to their total search volume, and learn how search trends varied within a certain interval of time.
When asked what would be the last thing he would recommend to video makers wanting to boost their video views on Social Media, Blake had no hesitation in choosing captions. “Social media feeds are often very crowded, fast-moving, and competitive. Nobody has time to open the video as full screen, turn the sound on and watch the whole thing, they often watch the videos without sound, and if the captions are not there then your message will not get through. And Transcriptive makes captioning a very easy process,” he says.
The struggle of making documentary films nowadays is real. Competition is high, and budget limitations can stretch a six-year deadline into a ten-year-long production. To make a movie you need money. To get the money you need decent, and sometimes edited, footage to show to funding organizations and production companies. And decent footage, well-recorded audio, and edited pieces all cost money to produce. I’ve been facing this problem myself and discovered through my work at Digital Anarchy that finding an automated tool to transcribe footage can be instrumental in making small, low-budget documentary films happen.
In this interview, I talked to filmmaker Chuck Barbee to learn how Transcriptive is helping him to edit faster and discussed some tips on how to get started with the plugin. Barbee has been in the Film and TV business for over 50 years. In 2005, after an impressive career in the commercial side of the Film and TV business, he moved to California’s Southern Sierras and began producing a series of personal “passion” documentary films. His projects are very heavy on interviews, and the transcribing process he used all throughout his career was no longer effective to manage his productions.
Barbee has been using Transcriptive for a month but already considers the plugin a game-changer. Read on to learn how he is using it to make a long-form documentary about the people who created what is known as “The Bakersfield Sound” in country music.
DA: You have worked in a wide variety of productions throughout your career. Besides co-producing, directing, and editing prime-time network specials and series for Lee Mendelson Productions, you also worked as Director of Photography for several independent feature films. In your opinion, how important is the use of transcripts in the editing process?
CB: Transcripts are essential to edit long-form productions because they allow producers, editors, and directors to go through the footage, get familiar with the content, and choose the best bits as a team. Although interview-oriented pieces are more dependent on transcribed content, I truly believe transcripts are helpful no matter what type of motion picture production you are making.
On most of my projects, we always made cassette tape copies of the interviews, then had someone manually transcribe them and print hard copies. With film projects, there was never any way to have a time reference in the transcripts, unless you wanted to do that manually. With video it became easier to make time-coded transcripts, but both of these methods were time-consuming and relatively expensive labor-wise. This is the method I’ve used since the late ’60s, but the sheer volume of interviews on my current projects and the awareness that something better probably exists with today’s technology prompted me to start looking for automated transcription solutions. That’s when I found Transcriptive.
DA: And what changed now that you are using Artificial Intelligence to transcribe your filmed interviews in Premiere Pro?
CB: I think Transcriptive is a wonderful piece of software. Of course, it is only as good as the diction of the speaker and the clarity of the recording, but the way the whole system works is perfect. I place an interview on the editing timeline, click transcribe and in about 1/3 of the time of the interview I have a digital file of the transcription, with time code references. We can then go through it, highlighting sections we want, or print a hard copy and do the same thing. Then we can open the digital version of the file in Premiere, scroll to the sections that have been highlighted, either in the digital file or the hard copy, click on a word or phrase and then immediately be at that place in the interview. It is a huge time saver and a game-changer.
The workflow has been simplified quite a bit, the transcription costs are down, and the editing process has sped up because we can search and highlight content inside of Premiere or use the transcripts to make paper copies. Our producers prefer to work from a paper copy of the interviews, so we use that TXT or RTF file to make a hard copy. However, Transcriptive can also help to reduce the number of printed materials if a team wants to do all the work digitally, which can be very effective.
DA: What makes you choose between highlighting content in the panel and using printed transcripts? Are there situations where one option works better than the other?
CB: It really depends on producer/editor choices. Some producers might want a hard copy because they prefer that to working on a computer. It really doesn’t matter much from an editor’s point of view, because it is no problem to scroll through the text in Transcriptive to find the spots that have been highlighted on the hard copy. All you have to do is look at the timecode next to the highlighted parts of a hard copy and then scroll to that spot in Transcriptive. Highlighting in Transcriptive means you are tying up a workstation, with Premiere, to do that. If you only have one editing workstation running Premiere, then it makes more sense to have someone do the highlighting with a printed hard copy or on a laptop or any other computer which isn’t running Premiere.
DA: You mentioned the AI transcription is not perfect, but you would still prefer it to paying for human transcripts or transcribing the interviews yourself. Why do you think the automated transcripts are a better solution for your projects?
CB: Transcriptive is amazingly accurate, but it is also quite “literal” and will transcribe what it hears. For example, if someone named “Artie” pronounces his name “RD”, that’s what you’ll get. Also, many of our subjects have moderate to heavy accents and that does affect accuracy. Another thing I have noticed is that, when there is a clear difference between the sound of the subject and the interviewer, Transcriptive separates them quite nicely. However, when they sound alike, it can confuse them. When multiple voices speak simultaneously, Transcriptive also has trouble, but so would a human.
My team needs very accurate transcripts because we want to be able to search through 70 or more transcripts, looking for keywords that are important. Still, we don’t find the transcription mistakes to be a problem. Even if you have to go through the interview when it comes back to make corrections, it is far simpler and faster than the manual method and cheaper than the human option. Here’s what we do: right after the transcripts are processed, we go through each transcript with the interviews playing along in sync, making corrections to spelling or phrasing or whatever, especially with keywords such as names of people, places, themes, etc. It doesn’t take too much time, and my tip is to do it right after the transcripts come back, while you are watching the footage to become familiar with the content.
DA: Many companies are afraid of incorporating Transcriptive into an ongoing project workflow. How was the process of using our transcription plugin in a long-form documentary film right away?
CB: We have about 70 interviews of anywhere from 30 minutes to one hour each. It is a low-budget project, being done by a non-profit called “Citizens Preserving History”. The producers were originally going to try to use time-code-window DVD copies of the interviews to make notes about which parts of the interviews to use because of budget limitations. They thought the cost of doing manually typed transcriptions was too much. But as they got into the process they began to see that typed transcripts were going to be the only way to go. Once we learned about Transcriptive and installed it, it only took a couple of days to do all 70 interviews, and the cost, at 12 cents per minute, is small compared to manual methods.
Transcriptive is very easy to use and it honestly took almost no time for me to figure out the workflow. The downloading and installation process was simple and direct, and the tech support at Digital Anarchy is awesome. I’ve had several technical questions and my phone calls and emails have been answered promptly, by cheerful, knowledgeable people who speak my language clearly and really know what they are doing. They can certainly help quickly if people feel lost or something goes wrong, so I would say do yourself a favor and use Transcriptive in your project!
Here’s a short version of the opening tease for “The Town That Wouldn’t Die”, Episode III of Barbee’s documentary series:
Recently, an increasing number of Transcriptive users have been requesting a way of using After Effects to create burned-in subtitles using SRTs from Transcriptive. This got us anarchists excited about making a Free After Effects SRT Importer for Subtitling And Captions.
Captioning videos is more important now than ever before. With the growth of mobile and Social Media streaming, YouTube and Facebook videos are often watched without sound, and subtitles are essential to keep them watchable and retain your audience. In addition, the Federal Communications Commission (FCC) has implemented rules for online video that require subtitles so people with disabilities can fully access media content and actively participate in the lives of their communities.
As a consequence, a lot of companies have style guides for their burned-in subtitles and/or want to do something more creative with the subtitles than what you get with standard 608/708 captions. I mean, how boring is white, monospaced text on a black background? After Effects users can do better.
While Premiere Pro does allow some customization of subtitles, creators can get greater customization via After Effects. Many companies have style guides or other requirements that specify how their subtitles should look. After Effects can be an easier place to create these types of graphics. However, it doesn’t import SRT files natively so the SRT Importer will be very useful if you don’t like Premiere’s Caption Panel or need subtitles that are more ‘designed’ than what you can get with normal captions. The script makes it easy to customize subtitles and bring them into Premiere Pro. Here’s how it works:
Windows: C:\Program Files\Adobe\Adobe After Effects CC 2019\Support Files\Scripts\ScriptUI Panels
Mac: /Applications/Adobe After Effects CC 2019/Scripts/ScriptUI Panels
4. Restart AE. The script will show up in After Effects under the Window menu as Transcriptive_Caption.
5. Create a new AE project with nothing in it. Open the panel and set the parameters to match your footage (frame rate, resolution, etc). When you click Apply, it’ll ask for an SRT file. It’ll then create a Comp with the captions in it.
6. Select the text layer and open the Character panel to set the font, font size, etc. Feel free to add a drop shadow, bug, or other graphics.
7. Save that project and import the Comp into Premiere (Import the AE project and select the Comp). If you have a bunch of videos, you can run the script on each SRT file you have and you’ll end up with an AE project with a bunch of comps named to match the SRTs (currently it only supports SRT). Each comp will be named: ‘Captions: MySRT File’. Import all those comps into Premiere.
8. Drop each imported comp into the respective Premiere sequence. Double-check that the captions line up with the audio (same as you would when importing an SRT into Premiere). Then queue the different sequences up in AME and render away. (And keep in mind it’s a beta and doesn’t create the black backgrounds yet.)
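For reference, the SRT files the importer consumes are just blank-line-separated cue blocks: an index, a `start --> end` timecode line, and one or more lines of caption text. Here's a minimal parser sketch showing the format (this is not the importer's actual code, just an illustration of what's in the file):

```python
import re

SAMPLE = """\
1
00:00:01,000 --> 00:00:03,500
Welcome to the show.

2
00:00:04,000 --> 00:00:06,000
Let's get started.
"""

def parse_srt(text):
    """Split an SRT file into cues: blocks of index / timecode / text."""
    cues = []
    for block in re.split(r"\n\s*\n", text.strip()):
        lines = block.splitlines()
        start, _, end = lines[1].partition(" --> ")
        cues.append({"start": start.strip(), "end": end.strip(),
                     "text": "\n".join(lines[2:])})
    return cues

print(parse_srt(SAMPLE)[1]["text"])  # → Let's get started.
```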
Although especially beneficial to Transcriptive users, this free After Effects SRT Importer for Subtitling And Captions will work with any SRT from any program. It’s definitely easier than all the steps above make it sound, and it’s available to all and sundry on our website. Give it a try and let us know what you think! Contact: email@example.com
We’re excited to announce that Beauty Box Video 4.0 is now available for Avid and OpenFX Apps: Davinci Resolve, Assimilate Scratch, Sony Vegas, NUKE, and more. This is in addition to After Effects, Premiere Pro, and Final Cut Pro which were announced in April.
Beauty Box Video 4.0 adds real time rendering to the high quality, automatic skin retouching that Beauty Box is famous for. It’s not only the best retouching plugin available but it’s now one of the fastest, especially on newer graphics cards like the Nvidia GTX 980. We’re seeing real time or near real time performance in Premiere Pro, Resolve, and FCP. Other apps may not see quite that performance but they still get a significant speed increase over what was possible in Beauty Box 3.0.
Easily being able to retouch video is becoming increasingly important. HD is everywhere and 4K is widely available allowing viewers to see more detail on closeups of talent than ever before. This makes skin or makeup problems much more visible and being able to apply digital makeup easily is critical to high quality productions.
You can also incorporate masks to limit the retouching to just certain areas like cheeks or the talent’s forehead. (as can be seen in this tutorial using Premiere Pro’s tracking masks)
So head over to digitalanarchy.com for more info and to download a free trial and free tutorials on how to get started and more advanced topics. You’ll be blown away by the ease of use, high quality retouching, and now… speed!
All of our current plugins have been updated to work with After Effects and Premiere Pro in Creative Cloud 2015. That means Beauty Box Video 4.0.1 and Flicker Free 1.1 are up to date and should work no problem.
What if I have an older plugin like Beauty Box 3.0.9? Do I have to pay for the upgrade?
Yes, you probably need to upgrade, and it is a paid upgrade. After Effects changed the way it renders, and Premiere Pro changed how it handles GPU plugins (of which Beauty Box is one). The key word here is probably. Our experience so far has been mixed. Sometimes the plugins work, sometimes not.
– Premiere Pro: Beauty Box 3.0.9 seems to have trouble in Premiere if it’s using the GPU. If you turn ‘UseGPU’ off (at the bottom of the BB parameter list), it seems to work fine, albeit much slower. Premiere Pro did not implement the same re-design that After Effects did, but they did add an API specifically for GPU plugins. So if the plugin doesn’t use the GPU, it should work fine in Premiere. If it uses the GPU, maybe it works, maybe not. Beauty Box seems to not.
– After Effects: Legacy plugins _should_ work but slow AE down somewhat. In the case of Beauty Box, it seems to work ok but we have seen some problems. So the bottom line is: try it out in CC 2015, if it works fine, you’re good to go. If not, you need to upgrade. We are not officially supporting 3.0.9 in Creative Cloud 2015.
– The upgrade from 3.0 is $69 and can be purchased HERE.
– The upgrade from 1.0/2.0 is $99 and can be purchased HERE.
The bottom line is try out the older plugins in CC 2015. It’s not a given that they won’t work, even though Adobe is telling everyone they need to update. It is true that you will most likely need to update the plugins for CC 2015 so their advice isn’t bad. However, before paying for upgrades load the plugins and see how they behave. They might work fine. Of course, Beauty Box 4 is super fast in both Premiere and After Effects, so you might want to upgrade anyways. :-)
We do our best not to force users into upgrades, but since Adobe has rejiggered everything, only the current releases of our products will be rejiggered in turn.
We’ve finally got all of our installers updated to recognize Adobe’s Creative Cloud. All of our plugins worked in CC, but you needed to point the installer to the right directory. We weren’t finding it automatically. Now we are! :-)
Actually this only affects the Photoshop Mac installers. The Windows installers look for the last version of the app you installed. If that was CC, then that’s what it would find. The Mac installers look for every installation of the app, so we have to specifically tell it to look for each new version. Probably more info than you wanted, but for those of you, uh, enjoying Creative Cloud… all of our products will install easily.
We’ve come a long way from Beauty Box Video 1.0, which was pretty slow. It’s now as fast as any other solution out there, and BB still offers the easiest and highest quality way of doing retouching for HD, 4K, and film. That said, it still requires a render and there are various things that can slow it down. It can really slow FCP X down if FCP isn’t configured correctly.
What should you expect speed-wise from Beauty Box?
A minute of HD video should take from 3-10 minutes to render out on a reasonably fast machine. So let’s discuss how to get those faster speeds. If you have a fast video card (say, the Nvidia 680 in an iMac) and are seeing really slow speeds, make sure you read to the end, where we discuss the configuration file BB uses.
After Effects: Beauty Box will render faster in AE than in any other host app. This is primarily because of how AE handles multiprocessing; it’s far better than any of the video editing apps. It requires a fair amount of RAM to really take advantage of, but it can run very fast. If possible, we recommend doing the Beauty Box pass in AE and then bringing the intermediate file into your editing app to cut.
Final Cut Pro 7 & X: If you’re using FCP X, turn off background rendering. Background rendering works great with basic filters, but with something render-intensive like Beauty Box, it will bring FCP X to its knees. Also, turn off scrubbing. FCP will start caching frames and, again, start rendering multiple frames in the background, which will really make FCP sluggish. Generally, we recommend either applying BB first and then turning it off as you’re editing, OR applying it last. Applying it last is the preferred way. You can take your edit, create a compound clip, and then apply Beauty Box to the compound clip. In FCP 7, a compound clip is called a ‘Nest’.
Premiere Pro: This is similar in some respects to what happens in Final Cut Pro. Beauty Box is not a real-time effect, so it’ll prevent the Mercury engine from rendering in real time. So, again, you want to apply Beauty Box either before you start editing (and turn it off while you edit) or as the last step after editing and color correction (recommended).
Video Cards: Beauty Box is accelerated using OpenCL. This means it’ll get a massive speed boost from newer Nvidia and AMD video cards. In practice, this speed boost can vary quite a bit. We’ve run into more problems with AMD cards than Nvidia, so we recommend Nvidia cards if possible, although AMD cards are usually fine, so it’s not a huge deal. It does tend to be a bit more of a problem on the Mac, where Apple creates the drivers: the AMD drivers tend to be more problematic than Nvidia’s. Regardless of which video card you have, we recommend getting the most recent Mac OS (and staying current with updates). Apple rolls driver fixes into the latest OS, so if you’re using an older OS, that’s potentially a problem. If you’re on Windows, you can just download the latest drivers, so it’s less of an issue.
What video card to get?
We still like the Nvidia GeForce GTX 570 as being the best price/performance option out there. For video applications, the Quadro cards don’t offer a lot of benefits. They tend to be slower and you’re paying for features that are more applicable to engineering/3D apps. If you do a lot of 3D work, the Quadro might be a better choice (I don’t do much 3D, so I can’t comment on that). The newer GeForce cards like the GTX 680 and GTX Titan are great, but don’t necessarily offer the speed boost to justify the extra cost. They are faster, so if you’re looking for the absolute fastest card then the Titan or GTX 690 is a great choice. Both cards require a ton of power, so make sure you’ve got a small nuclear power plant as your power supply.
OpenCL Configuration File
Beauty Box creates a special configuration file for the video cards in your system. This file can be sent to us to help troubleshoot any problems, and it also keeps track of whether a given video card is crashing when used with BB. If the video card is consistently crashing, this is a good thing. However, sometimes you’ll have a random one-off crash and BB will disable OpenCL, which causes a dramatic slowdown in rendering. The solution is to delete the configuration file. BB will then recreate a default file the next time it starts, and rendering speeds will be back to normal. But you need to know where the file is to delete it, so here are the locations:
The new version of Beauty Box Video for After Effects, Premiere, and Final Cut Pro 7/X is available for purchase or you can download the trial version. We’ve added a number of great new features, first and foremost is greatly improved automatic masking. This allows us to more accurately identify the skin tones and track them throughout the video clip. This means the retouching that Beauty Box does looks better than ever. Here’s an example:
No automatic mask is perfect; we’re still picking up a bit of the background, but it’s much improved over 1.0.
The other big new feature is the addition of preset Styles. It ships with 35 different styles to give your video a wide variety of different looks from a warm glow to a ‘day to night’ look. These are modifiable, so you can adjust the amount of smoothing up or down.
We’ve also improved the shine removal, improved the OpenCL support, so it should be faster on most cards, and made a bunch of other small improvements and bug fixes.
It’s a great upgrade and until June 30th it’s only $59 for Beauty Box 2.0 users ($99 for 1.0 users). New licenses are also on sale for only $149 (save $50!).
Oh, and if you’re wondering what happened to OpenFX support… we’re adding NUKE support and the OpenFX version will be released in a couple weeks. At the same time, we’ll also be releasing a brand new version for Avid systems! You can download the beta of both the OpenFX and Avid builds here.
Around this time of the year, you start seeing a lot of talk about what’s going to be released at NAB. It’s always interesting to look at some of the larger trends that are out there. Of course, what’s trending for Digital Anarchy is Beauty Box 3.0. The photo version just got released (see below) and the video version is not far behind. But beyond that…
All the speed tests we’ve done with Beauty Box on Windows show the Nvidia GeForce video cards to outpace their much more expensive cousins, the Quadros, significantly. A GTX 570 (~$270) is about 25-30% faster than a Quadro 4000 ($800).
Since Beauty Box can involve some render time, we’ve wished that Apple would authorize one of the newer GeForce cards for the Mac. No such luck. So we’re tired of waiting. We took a stock PNY GeForce 570 and put it into our MacPro. And lo! It works!
So… what’d we do and what are the caveats? This was not a 570 with ‘flashed’ ROM. This was just a straight up 570 which we use in one of our PC machines. Nothing fancy. We did need to download a few things:
– If you’re using Premiere, you need to update the cuda_supported_cards.txt file to add the name of the video card. In this case, it would be ‘GeForce GTX 570’. To do this, go to the Premiere.app file, right-click on it, and select ‘Show Package Contents’. Once you do that, this is what you’ll see:
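The edit itself is just appending one line to a text file. Here’s a minimal sketch of that step (the function name is my own, and the exact path to cuda_supported_cards.txt inside the .app bundle depends on your Premiere version; back the file up first, and note you may need admin rights to modify it):

```python
def add_supported_card(file_path, card_name="GeForce GTX 570"):
    """Append card_name to cuda_supported_cards.txt if it isn't already listed."""
    with open(file_path) as f:
        text = f.read()
    if card_name in text.splitlines():
        return False  # already listed, nothing to do
    with open(file_path, "a") as f:
        if text and not text.endswith("\n"):
            f.write("\n")  # make sure the new entry starts on its own line
        f.write(card_name + "\n")
    return True
```

The card name must match what the driver reports exactly, so double-check the spelling before saving.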
So you’ve got two (or 20) computers and you want to use Beauty Box (or whatever) on all of them.
This is always a tricky thing for software developers. On one hand, we realize many folks have multiple machines, and since they’re only one person, they can only use one machine at a time. We’d like to allow them the flexibility of having the software on a couple of machines. On the other hand, if you’re a studio with multiple machines and multiple people, we think that if our software is good enough to be installed and used on all those machines, we should be paid for it. Making sure that happens sometimes gets in the way of how a single user is using our plugins.
When you buy a license of our software, you’re buying it for one user. If you’re a company with multiple machines and multiple artists/editors using those machines, then there’s not much gray area and you need a license for each computer being used. We offer pretty good volume discounts and site licenses for this type of situation; you can contact firstname.lastname@example.org for pricing.
There is one big exception to this… if you’re using After Effects’ network rendering. You do not need extra licenses for After Effects render nodes. You can install Beauty Box on as many render nodes as you want for free.
People (and, no, companies are not people. I don’t care what the Dread Pirate Roberts says)
If you’re just one person with multiple machines, then there are some gray areas. The software can be installed on a couple of machines, but we use the internet to determine if the plugin is being used on multiple computers at the same time. So if you have a desktop and a laptop and you’re using one or the other depending on whether you’re at home or at the office, no problem. You’re good to go.
However, if you’re in your studio/office and trying to use both machines for rendering/editing at the same time, you may run into problems. If so, here’s what you can do:
1) You can purchase a second license. We do offer discounts for second licenses. Contact email@example.com.
2) Use the second machine as an After Effects render node. As mentioned above, you can use Beauty Box on as many render nodes as you want for free. So if the machine is just being used to process frames sent to it from another machine you shouldn’t have any problems.
3) Our licensing is set up so that you can install on two machines, they just can’t be in use simultaneously. The way we check this is via the internet. So if you disable the internet connection on one machine, then we can’t check it. This is a hack and technically violates the license. However, since the spirit of the license is for one user, as long as it’s the same person using the machines we’re ok with it.
4) Render out the Beauty Box clip on one machine while working on another part of your project on the second machine. BB just gets watermarked on the second machine, so it’s still usable.
Like most of you, we’re running a small company. We try to be as flexible as possible, but if you’re making money using our software we would like you to buy the correct number of licenses. Please support the companies that make the tools you use and that help you be successful.
We’re about to release a free update to Beauty Box Video (2.0.4… look for it next week) and figured it was time to talk about GPUs again. We’re seeing a 500-800% speed increase using the GPU on newer graphics cards, especially Nvidia boards, which seem to be more stable than AMD or Intel ones.
In case you missed it, last week on Halloween we released a free filter called Ugly Box! The blog post is a little late for Halloween (although they are celebrating it in New Jersey today), but if you’re tired of all the election nonsense, there’s still plenty of time to use it to make Glenn Beck more interesting.
I think one of the biggest surprises we had when we released Beauty Box 1.0 was that people kept asking us if it could make people look worse. Considering how much detail you can see on HD and how bad some people looked on HD, I didn’t really think there’d be a need for a filter to do that. But… we give our customers what they want…
With Beauty Box 2.0, you could set Skin Detail Smoothing to a negative number, resulting in, yep, Ugliness! It takes the skin texture, amplifies it, and sharpens it, making your talent either look a bit older or flat-out hideous, depending on their skin and the settings. Ugly Box, the filter we’re releasing for free, lets you use that aspect of Beauty Box. It’s a bit of a one-trick pony; you don’t have all the control you do with Beauty Box, but it can definitely make the folks in your videos look a lot worse.
Anyways, all the details are below, so download it for free and have fun with it! I figured it’d be a great Halloween treat for all you visual effects artists and editors doing last minute scary videos (or election videos…). ;-)
From time to time, our customers are gracious enough to send us their amazing work in which they have been using our plugins. Pete Saunders is one of those customers.
Earlier this week Pete offered up his work for us to use in our 3D Invigorator gallery section. He is the Company Director for After Hours Creative, a small design studio in England that specializes in high impact graphics.
(Above) “Created for an individual looking for a totally unique card design. We decided to create a romantic feel to the word “Love” with just colours, textures and a small selection of well placed images. The combination of 3D text created in 3D Invigorator, textures in Texture Anarchy and images of small roses achieved the look and feel perfectly.” -Saunders
We were excited to receive an email from Aaron Brenner, of the LA Kings hockey team, letting us know that they had used Beauty Box Video on a high profile piece they were doing.
An interesting aspect to Beauty Box Video is that it’s difficult to get people to admit they are using it. A LOT of production companies have bought and loved the software but they’re a little shy about singing its praises publicly. Their actor and actress clients aren’t too keen about wanting fans to know they used software to make them (more) beautiful.
This wasn’t a problem for the subjects of Aaron’s production for the Kings. It’s a behind-the-scenes video of the photo shoot for the LA Kings’ Ice Girls calendar! Some very beautiful girls who you wouldn’t think would need much retouching.
(Click on the image above to be taken to the King’s site and see the video.)
The example above shows one before/after image with only a little skin correction needed. The example below shows a more extreme example of skin smoothing. The plugin is great for both kinds of situations because it always gives a natural look.
I recently spoke with William Branson III about our exciting new product release of Beauty Box Photo and was reminded of how much I love his artwork. He is an amazing portrait artist and his images really push the limits of photograph vs. painting. Check out his work here: http://www.wbranson.com/
Since one week is a decade in internet time, I’m seeing this February post about green screening an eternity too late. But I still think it’s interesting, as is most of the stuff that I find through BoingBoing.net. The movie shown below is the 2009 Virtual Backlot Reel from StargateStudios.
It’s fascinating — and maybe a bit disturbing — to realize that mundane scenes in TV shows are now regularly treated as visual effects events. Digital Anarchy first developed Primatte Chromakey, our Adobe Photoshop plugin for green screen masking, in mid-2005. At the time, we had to spend a lot of time simply explaining to photographers what ‘green screen’ meant. Five years later, green screen is a recognized entity with information accessible on non-pro sites like ehow.com. The convergence continues!
3D tends to be a new thing for most people, so we get a lot of questions about it. Here are the most frequently asked ones, the basic answers, and links to our video tutorials that explain the answers in depth.
1: My Illustrator files won’t import, why?
Save your Illustrator files as Illustrator 10 files. Invigorator can’t read vector files saved out in newer Illustrator formats, so save your files as Illustrator 10 files to ensure they’ll come into 3D Invigorator. See the tutorial on Illustrator files:
2: My complex Illustrator file isn’t coming in correctly. Why?
When we first launched Primatte, we tested a variety of ‘greenscreen’ backgrounds to determine what to recommend. Paper backgrounds turned out to be worst and we had the best luck with a velcro/foam material.
Well… apparently not all paper backgrounds are made equal!
I don’t remember who made the paper background we initially tested. But it was awful. Very reflective and prone to hot spots. We figured all paper would have the same problems. After listening to a talk by another company that does greenscreen software, I decided to revisit this and give Savage Paper’s ‘tech green #46’ a try.
So how’d it fare vs. the foam material we’ve been recommending since day 1?
I had a very nice email exchange with customer John Gunmann a few months ago. Meant to blog immediately about the talk but other conversations kept piling on top. Figure this topic will be a wonderful final post of 2009. Especially since John was so pleased that he told me to buy a top shelf drink on the Digital Anarchy tab, which perhaps I will do tonight for New Years Eve.
Last week, we received an order for our Backdrop Designer plugin. On that same day, we received a request to resend a previous purchase of the Backdrop Designer plugin. Since I am Digital Anarchy’s customer service person and our order fulfillment department — as well as blogger extraordinaire — I recognized the name in both emails. The customer already owned Backdrop Designer but was purchasing it again.
This is one of the reasons we hand-fill orders. It’s not uncommon for someone to forget he already owns a product. It’s also not uncommon for someone to think she bought the product when really she only downloaded the demo. This was the basis of two different support calls last week.
(By the way, I feel this theory should extend to cheesecake. I should be able to taste its sweetness just by thinking about eating cake, and therefore save myself the calories.)
Royalty-free cheesecake served up from www.freefoto.com. Cool stuff on that site.
A few days ago, I received a great email from a new customer named Mark Edwards. He wrote us a nice note about his purchase of ToonIt! Photo, which is our Adobe Photoshop cartooning plugin, and attached some images to his email.
Mark said, “Thanks for the cool tool. After only a few minutes of playing around with it, I decided to buy it (original and toonit versions of one picture attached). Love it!”
When we first started selling 3D Invigorator, there were some questions about when you would ever need such a tool. But for designers, it can give you some really interesting options without going into a 3D program. GoMediaZine, an online magazine for designers, recently had a tutorial on some very cool typographic effects. In the tutorial they use Cinema 4D, an excellent 3D program. However, they could have used 3D Invigorator.
The 3D Invigorator is our newest Adobe Photoshop plugin for creating 3D logos and other fun 3D graphics. We’ve recently posted some great video tutorials that explain this powerful plugin. Watch new movies that explain the Material Editor, your one-stop shop for prettying up that 3D model, and its interesting texturing options for Textures and Bump Maps.
We have also added a movie that compares working in 3D Invigorator’s environment with making use of Photoshop’s new 3D capabilities. The difference is pretty vast and we explain it all in this video tutorial.
Just read an article on one of my favorite industry news sites, www.studiodaily.com, which is related to Studio Monthly magazine. It’s about a new sci-fi film that uses relatively low-budget techniques to tell a story about the future of Mexico. The film is Sleep Dealer and the director is Alex Rivera.
I always enjoy reading about people’s hardware and software choices and moreso about their creative decisions. But what I really enjoyed about this article was the final interview question asked of Rivera.
Every time I write a manual for our company, I inevitably stumble upon the need to explain some basic terms. ‘Basic’ isn’t really the correct descriptor because it often implies that something is easy to understand.
For instance, this past week I was writing about a parameter in our ToonIt! Photo plugin. The control is called Lighter Type and the way to describe its Lighter1 option is to say that Lighter1 alters the ‘gamma’ of the source image. Well, I know that ‘gamma’ refers to colors but whew, I get completely lost after that.
I’m doing some product development for our ToonIt! Photo plugin, and wound up playing with some personal photos as source material. Looking at my cat is typically more interesting than iStock and this is a great photo to cartoon. The subjects’ faces are aimed at the viewer and their facial details are very clear. I also like that the background is blurred out in the original. We were photographed in my kitchen and those kinds of environments often don’t look all that interesting, even as a cartoon.
Here’s the result with the ToonIt! defaults rendered out of Photoshop:
Yesterday was an exciting one as Digital Anarchy branched out into a new host application: the wild world of Aperture. Our Photoshop plugins ToonIt! Photo and Knoll Light Factory are now available for use in Aperture.
Our president, Jim Tierney, had the disadvantage of working on this product release remotely… from Hawaii. Here are some of his hard-earned Maui test shots for Knoll Light Factory.
Digital Anarchy launched our newest product, ToonIt! Photo, just before Christmas. It’s a fun new Adobe Photoshop plugin that’s cartooning software for photos and other graphics.
Unfortunately, the ToonIt! manual took a week longer than the product release. It’s always the little stuff, like forgetting to plug in the toaster, that trips me up. You can get the ToonIt! Photo manual from here. I apologize for the wait. Writing manuals is _almost_ as difficult as reading them.
Last month, Digital Anarchy had some difficulty with our server, store and site… shudder… and had to change vendors unexpectedly. I’ve been combing through our media ever since, trying to find content that didn’t properly survive the transition.
Which caused me to stumble upon one of my favorite artists in our Primatte Chromakey gallery. John Riley, Ph.D., is a physicist and associate professor who initially contacted Digital Anarchy about some graphics work for which he was using Primatte, an Adobe Photoshop plugin for blue/greenscreen masking.
I read today on the Studio Photography blog that Polaroid will stop producing its instant film. The article rounds up some interesting vignettes about Polaroid aficionados and why they love the medium, but here’s the meat of the news:
“Sixty years after Polaroid introduced its first instant camera, the company’s iconic film is disappearing from stores. Although Polaroid says the film should be available into 2009, this is the final month of its last production year. Eclipsed by digital photography, Polaroid’s white-bordered prints — and the anticipation they created as their ghostly images gradually came into view — will soon be things of the past.”
This discontinuation feels quite sad. Although I don’t use Polaroid anymore, I remember years back when my friends and I would take Polaroids of each other at parties and tape the photos to a window or sliding glass door. By the end of the evening, we’d have a timeline of the party and all of the silly and sweet things that had occurred.
Hmm, and perhaps my statement multiplied by 1 or 3 million is why the instant film is being discontinued. Memories don’t always translate into dollars. Also, while Polaroid is nostalgic to me and perhaps the generation above me, it’s not to someone in their teens or 20s.
Seems to me that if Polaroid did some marketing and made that medium feel relevant, then it could still sell okay. But I guess they’re a big company and it’s just not worthwhile to their bottom line.
Yesterday we released our first new product in awhile. It’s an Adobe Photoshop plugin called ToonIt! Photo and the software creates absolutely gorgeous cartoons from photographs and other still images.
As I was working on the material for our launch, to keep myself amused late into the night, friends emailed me close-up photos of themselves. I would run a quick toon on the photo and send the new image back. That was fun to do and a new word was born: ‘cartoodling’. It’s when you (ok, me) play around with a cartoon filter and just whittle and doodle the time away…
… has been rather exaggerated. Ok, way over-exaggerated.
Layoffs happen at big companies. When things are great, you tend to hire based on great expectations. It’s better to have too much capacity and grow into it than to be overwhelmed. The flip side is that when things slow, you need to trim down and, unfortunately, that means layoffs. An 8% reduction in workforce really isn’t something that should be seen as that concerning. At least, from an end user’s perspective… for the folks getting laid off… yeah, it sucks. Although Adobe has been known to give nice severance packages.
Adobe laid off 150 people in 2001, and Macromedia laid off 170, which was 10% of the staff at the time (partially because of a merger, but if things had been booming I don’t think it would have been nearly as high). So layoffs are hardly unprecedented. If Adobe and Macromedia survived the dot-com implosion, I’m sure they’ll do ok this time around.
The other factor in all this is that it’s incredibly difficult to get loans or other financing right now. You would think (and this is WHOLE other rant) that with the banks getting all this taxpayer money they’d be back in business making loans. But no. Things are tighter now than they were 6 months ago.
So… companies like Adobe really need to conserve the cash they have on hand. They don’t have as much flexibility to ‘wait and see’.
This was, at least from Adobe’s perspective, a smart and necessary thing to do. Digital Anarchy is dependent on Adobe products, and I’m not reading anything into this other than just the normal reaction to the reduced expectations that happen in a recession (We’ve been in one for about 9-12 months at this point).
For Digital Anarchy, we’re proceeding much like Adobe (minus the layoffs… we don’t have enough people as it is :-), cutting the costs we can and continuing to release products. We’ve got four products on schedule to be released over the next 3-4 months. With any recession you can’t stop investing in new products, but you do need to watch your costs very carefully. That’s all Adobe is doing.
Jim Tierney www.digitalanarchy.com
Filters for Photography & Photoshop
f/x tools for revolutionaries
We’re all pretty excited around here at Digital Anarchy about our upcoming product release. Usually we don’t talk about products until they are released, but we pre-announced this product earlier in the year — err, a few times earlier in the year — and it’s finally hitting the market this week.
The product is ToonIt! Photo and you can see images, well, right here. You can also check out footage showing off last year’s release of ToonIt for video apps. The medium is different but the underlying software is the same.
Even though I am working through the weekend, I’m having a blast writing our manual and web pages and tutorial scripts. After all, how can it NOT be fun to turn yourself (and mom) into a cartoon?
Wherein Jim Tierney rants and opines about After Effects, Premiere Pro, Final Cut Pro, and other nonsense