MacDirectory magazine is the premier creative lifestyle magazine for Apple enthusiasts featuring interviews, in-depth tech reviews, Apple news, insights, latest Apple patents, apps, market analysis, entertainment and more.
Issue link: https://digital.macdirectory.com/i/1488864
Photoshop Everywhere Though we were hoping for a formal debut, the web version of Photoshop is still in beta. It keeps getting better, though, and it's becoming more obvious where it can fit into our workflows. Meanwhile, Photoshop for iOS gained Smart Background Removal and Content-Aware Fill. Both functions are adequately speedy on our second-generation iPad Pro. Not to be outdone, Photoshop's desktop version lets you hover over an area to select a complex object, then delete and intelligently fill an area with a single click. Live Text from Illustrator is now editable, certainly making relations between the print and video/web sides of a shop more amicable.

In Lightroom, AI-powered masking gets even smarter. It can detect multiple people in a frame, selecting any or all of them. Going even further, it can separate out specific physical attributes for enhancement. The Healing tool is now intelligently content-aware.

Sure, you could always make a group of objects look like they are intertwined in Illustrator by cutting paths, then moving them forward and backward. Any changes, of course, are major surgery. Adobe's new Intertwine feature pegged the applause meter in the Microsoft Theater Tuesday morning. Select an area where two objects intersect and they become like two links of a chain, and virtually as easy to reposition. Those links can be "broken" just as easily if you want to try something else.

This year, when it came to video, Adobe had a lot to shout about. Unlike a lot of new technology, Camera-to-Cloud is exactly what it says it is: an efficient and practical way to move video and still photos over the internet quickly, in full quality, directly from the camera. It takes the form of an open API provided by Frame.io to any manufacturer who wants it. We had a chance to talk to one of its creators, Frame.io's Michael Cioni, later that day.
A Significantly New Dimension The keynote presentation on Substance 3D was another indicator of where our future is heading. In recent years, the ability to create photorealistic 3D renderings of designs without having to build and photograph prototypes or samples has provided an unprecedented speed boost for design-focused businesses and industries. A product can go from concept to catalog without leaving a virtual workspace. According to Adobe's Chief Product Officer (and Behance co-creator) Scott Belsky, this has been a game-changer for everyone from Ben & Jerry's to NASA.

Substance 3D now brings the designers themselves into that same immersive, virtual workspace with very tactile VR modeling tools. Sculptors can sculpt there with the same tactile techniques they would use in the real world, but with a few very nice extras like active symmetry: modify one side and the change is instantly reflected on the other. This is one of the awe-inducing features of a new app Scott introduced at the conference, Substance Modeler. You can don a VR headset and handgrips to mold the initial design with your hands, then move back to the desktop for detailed control. The finished model can go into Substance Painter for materials and textures, then to Substance 3D Stager to place it in its own world. If you don't want your product to look shiny and new, Substance Painter can now intelligently age, scratch, fray, and even dent a Substance 3D model in ways that range from slightly worn to train-wreck survivor.

Though the Substance apps are serious, professional products (and are sold as a separate subscription), they are designed to be easy to adapt to and use for those with experience with Adobe's other applications. There are going to be a lot of creative opportunities for those who can pick up these skills.