Adobe is in the early stages of allowing third-party generative artificial intelligence tools such as OpenAI’s Sora and others inside its widely used video editing software, the U.S. software maker said on Monday.
Adobe’s Premiere Pro app is widely used in the television and film industries. The San Jose, California, company is planning this year to add AI-based features to the software, such as the ability to fill in parts of a scene with AI-generated objects or remove distractions from a scene without any tedious manual work from a video editor.
Both those features will rely on Firefly, an AI model that Adobe has already deployed in its Photoshop software for editing still images. Amid competition from OpenAI, Midjourney and other startups, Adobe has sought to set itself apart by training its Firefly system on data it has full rights to and by offering users indemnity against copyright claims.
But Adobe also said on Monday that it is developing a way to let its users tap third-party tools from OpenAI, as well as startups Runway and Pika Labs, to generate and use video within Premiere Pro. The move could help Adobe, whose shares have fallen about 20% this year, address Wall Street’s concerns that AI tools for generating images and videos put its core businesses at risk.
OpenAI has demonstrated its Sora model generating realistic videos based on text prompts but has not made the technology public or given a timeline for when it will be available. Adobe, which released a demonstration of Sora being used to generate video in Premiere Pro, described the demonstration as an “experiment” and gave no timeline for when it would become available.
Deepa Subramaniam, Adobe’s vice president of product marketing for creative professional apps, said that Adobe has not yet settled how revenue generated by third-party AI tools used on its software platform will be split up between Adobe and outside developers.
But Subramaniam said that Adobe users will be alerted when they are not using Adobe’s “commercially safe” AI models and that all videos produced by Premiere Pro will indicate clearly which AI technology was used to create them.
“Our industry-leading AI ethics approach and the human bias work that we do, none of that’s going away,” Subramaniam told Reuters. “What we’re really excited to do is explore a world where you can have more choice beyond that through third-party models.”
The recent attack on the XZ Utils supply chain was not an isolated incident, but rather part of a larger social engineering campaign that sought to compromise numerous JavaScript projects, experts have warned.
In a joint blog post, the OpenSource Security Foundation (OSSF) and OpenJS Foundation said that the OpenJS Foundation Cross Project Council received “a suspicious series of emails” all similar to one another, and mentioning similar GitHub-associated emails.
In the message, the senders urged OpenJS to update one of its popular JavaScript projects to “address any critical vulnerabilities”. Furthermore, they asked to be made new maintainers of the projects – something that was apparently done in the XZ Utils supply chain attack.
False sense of urgency
The attacks were, fortunately, not successful, the blog adds, as none of these individuals were given any privileged access.
Still, maintainers should be wary of “friendly yet aggressive and persistent” people demanding maintainer status for different projects – especially people who are relatively unknown members of the community. Even people endorsing such individuals shouldn’t be fully trusted, as they are most likely “sock puppets” – people with fake identities all working towards the same goal.
Finally, the attackers will try to establish a false sense of urgency, all so that the maintainers drop their guard and grant them privileged access.
“These social engineering attacks are exploiting the sense of duty that maintainers have with their project and community in order to manipulate them,” the researchers warn. “Pay attention to how interactions make you feel. Interactions that create self-doubt, feelings of inadequacy, of not doing enough for the project, etc. might be part of a social engineering attack.”
XZ Utils, a set of data compression tools and libraries used by major Linux distros, was found vulnerable to CVE-2024-3094. The flaw was introduced into XZ version 5.6.0 by a pseudonymous attacker and persisted through version 5.6.1. The discovery of the vulnerability pushed back the release of the Ubuntu 24.04 beta by a week.
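As a quick illustration (not from the report itself), a minimal sketch of how an administrator might flag the affected releases could look like this — the helper name and version set are assumptions based on the CVE-2024-3094 advisory, which names 5.6.0 and 5.6.1 as the backdoored versions:

```python
# Hypothetical helper: flag xz/liblzma versions known to ship the
# CVE-2024-3094 backdoor (5.6.0 and 5.6.1, per the advisory).
AFFECTED = {"5.6.0", "5.6.1"}

def is_affected(version: str) -> bool:
    """Return True if this xz version string matches a backdoored release."""
    return version.strip() in AFFECTED

print(is_affected("5.6.1"))  # True
print(is_affected("5.4.6"))  # False
```

In practice one would feed this the output of `xz --version`; distributions resolved the issue by rolling back to earlier, unaffected releases.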
Motorola has launched a trio of new phones to lead its smartphone range: the Moto Edge 50 Fusion, the Moto Edge 50 Pro and the Moto Edge 50 Ultra.
With a combination of specs that cover the upper and lower ends of the mid-range phone arena, Motorola’s new Edge 50 series looks set to challenge models found among the best cheap phones, in addition to snapping at the heels of some of the best phones – notably the likes of the Google Pixel 8.
Starting with the entry-level Moto Edge 50 Fusion, this £349 / €399 (we don’t have pricing for other regions yet) phone comes with a 6.7-inch Full HD+ pOLED 144Hz refresh rate display, a Qualcomm Snapdragon 7s Gen 2 chipset matched with 8GB or 12GB of LPDDR4X RAM, and UFS 2.2 storage options of 128GB, 256GB and 512GB.
Charging comes in at a speedy 65 watts when wired, with the battery capacity rated at 5,000mAh. The rear camera array is made up of a 50MP main camera with optical image stabilization, and a 13MP ultra-wide camera with a 120-degree field-of-vision. The front-facing camera comes in at 32MP.
All in all, those are fair if not exactly standout specs for an affordable Android 14 phone. But helping the Moto Edge 50 Fusion stand out is a trio of colors: Forest Blue and the pastel-like Marshmallow Blue, both in vegan leather, and a vegan suede finish in Hot Pink.
Going Pro
Coming in at £599 / €699, the Moto Edge 50 Pro makes a play for the higher end of mid-range phones. It sports a 6.7-inch 144Hz pOLED display with a “Super HD” resolution and HDR10+ certification, a Snapdragon 7 Gen 3 chip, up to 12GB of RAM and 512GB of storage. There’s 125W ‘TurboPower’ rapid wired charging for quick refueling and 50W TurboPower wireless charging for the 4,500mAh battery.
On the camera side, there are three rear lenses, with the main one sporting a 50MP sensor complete with optical image stabilization, laser autofocus, and a 3-in-1 light sensor. The second camera is the same 13MP ultrawide found on the Fusion. But unlike the cheaper phone, the Edge 50 Pro also comes with a 10MP 3x optical zoom telephoto camera with OIS (optical image stabilization). Flip the phone around and you’ll be greeted by a 50MP front-facing camera.
Again, these are all specs one might expect from a phone at this price point. But the Edge 50 Pro arguably offers something neat in the form of two vegan leather-backed models in ‘Luxe Lavender’ or ‘Black Beauty’ – basically a light purple and black respectively – and a ‘Moonlight Pearl’ marble polymer option, which either looks sophisticated or kitsch depending on your tastes.
Ultra flair
Topping out the trio of Edge 50 phones is the Moto Edge 50 Ultra. This £849 / €999 Android phone arguably has the Google Pixel 8 Pro in its sights, by taking a lot of the Edge 50 Pro but adding in a Snapdragon 8s Gen 3 chip, 16GB of RAM and 1TB of storage.
The Moto 50 Ultra’s main camera is a 50MP snapper like that of the Pro phone, but has slightly bigger pixels. It also makes use of a 50MP ultra-wide camera with larger pixels, and a 64MP telephoto camera with a 3x optical zoom. The front-facing camera has a 50MP sensor for people who love clear video calls and snapping selfies.
In terms of standout design, the back of the Moto 50 Ultra can be bought in vegan leather in the Pantone colors of Peach Fuzz and Forest Grey, with a ‘Nordic Wood’ option to offer a dose of natural materials to the Ultra phone.
Style and smarts
Speaking of Pantone, the cameras and displays of these three phones have been developed with help from the color-centric organization to help capture and display images with accurate colors and skin tones.
All three phones come with IP68 water and dust resistance, and have access to ‘Moto AI’ features that help the cameras with things like autofocusing and capturing motion, in addition to generating themes and wallpapers for the phones, and aiding in navigation and search queries. And with Google Photos being used as the default photo app for the phones, the Edge 50 series can all tap into smart editing features including Magic Editor, Magic Eraser, and Photo Unblur.
Finally, the phones have the redesigned Hello UX user interface built on Android 14. This brings in more personalization options, slicker gesture control and better security features, including the ability to set up a safe space for kids to use the phone with limited access to apps. But all of this is a relatively light touch over Android 14, meaning Google’s software design and features are still given room to shine.
First impressions
I got a closer look at the phones in person at a Motorola media event and came away quietly impressed. There’s both a playfulness and clear vision in the phones’ design, offering mid-range phones that, to my eyes at least, offer something a little different to the rather run-of-the-mill phones one gets in this segment of the mobile market.
The Moto Edge 50 Pro was the phone I spent the most time with, and it’s the model I feel threads the needle neatly between specs, features and price. The Edge 50 Ultra has some flagship-chasing features but I feel its price is a tad too high and it’ll need to bring something special to the table in everyday use to take on the likes of the Pixel 8 Pro and OnePlus 12.
The Edge 50 Fusion didn’t grab my imagination in terms of specs and features, but its neat design and colors, combined with a friendly price tag, could make it a solid contender for entry-level mid-range phones, appealing to people who want a little more phone than cheaper options typically offer without paying much more.
Availability of the Moto Edge 50 series starts from today (April 16) with the Edge 50 Pro and runs into mid-May with the other models. But we’ll need to put these phones through their paces before we decide if they are new contenders in the mobile market that can help Motorola once again rise to the prominence it enjoyed in the early 2000s.
AI roundup, April 16: The United Kingdom has begun drafting AI regulations covering powerful tools such as OpenAI’s ChatGPT. The regulations aim to restrict or eliminate the potential harms of the emerging technology. In other news, Microsoft will be investing $1.5 billion in a United Arab Emirates-based artificial intelligence company named G42. The partnership has been approved by the U.S. and UAE governments. Check out similar AI news from today.
1. UK has started to develop AI regulations to eliminate the risks
The UK has started drafting AI regulations that will apply to large language models and other emerging AI technologies, covering both current and future AI tools. The regulations are expected to mitigate the potential risks of AI and its usage, according to a Bloomberg report. However, no timeline was given for when the regulations will be announced.
2. Microsoft to invest $1.5 billion in UAE-based AI firm G42
Microsoft announced that it will be investing $1.5 billion in a United Arab Emirates-based artificial intelligence company named G42. It is also reported that G42 will leverage Microsoft cloud services to run its AI applications. Sheikh Tahnoon said in a G42 statement, “Microsoft’s investment in G42 marks a pivotal moment in our company’s journey of growth and innovation, signifying a strategic alignment of vision and execution between the two organizations,” according to a Reuters report.
3. Agritech company Cropin Technology launches open-source AI model for farmers
Google-backed Agritech company Cropin Technology announced the ‘akṣara’ AI model to help farmers and the agricultural industry. It will provide climate-smart agriculture practices for 9 crops: paddy, wheat, maize, sorghum, barley, cotton, sugarcane, soybean, and millet in five countries in the Indian subcontinent. Krishna Kumar, Founder and CEO of Cropin said, “Domain-specific AI models for agriculture are expected to attract significant investments, offering a practical and economically viable approach to food systems transformation,” according to an Investing.com report.
4. DeepMind CEO to spend more than $100 billion on AI
Google DeepMind Chief Executive Officer Demis Hassabis highlighted the company’s AI business and said that it will be spending more than $100 billion to develop artificial intelligence technology. Hassabis said, “We don’t talk about our specific numbers, but I think we’re investing more than that over time.” However, he did not give any specifics on how the company will spend the huge sum, according to a Bloomberg report.
5. Baidu’s Ernie Bot has more than 200 million users
China’s Baidu revealed that its AI chatbot, Ernie Bot, has more than 200 million users, making it one of the most popular AI tools in the country. Baidu CEO Robin Li also highlighted that Ernie Bot’s application programming interface (API) was being used 200 million times a day. However, Ernie Bot faces little direct competition from ChatGPT, which is unavailable in China, according to a Reuters report.
OpenAI’s Sora gave us a glimpse earlier this year of how generative AI is going to change video editing – and now Adobe has shown how that’s going to play out by previewing some fascinating new Premiere Pro tools.
The new AI-powered features, powered by Adobe Firefly, effectively bring the kinds of tricks we’ve seen from Google’s photo-focused Magic Editor – erasing unwanted objects, adding objects and extending scenes – to video. And while it isn’t the first piece of software to do that, seeing these tools in an industry standard app that’s used by professionals is significant.
For a glimpse of what’s coming “this year” to Premiere Pro and other video editing apps, check out the video below. In a new Generative panel, there’s a new ‘add object’ option that lets you type in an object you want to add to the scene. This appears to be for static objects, rather than things like a galloping horse, but it looks handy for b-roll and backgrounds.
Arguably even more helpful is ‘object removal’, which uses Firefly’s AI-based smart masking to help you quickly select an object to remove, then make it vanish with a click. You can also combine the two tools to, for example, swap the watch that someone’s wearing for a non-branded alternative.
One of the most powerful new AI-powered features in photo editing is extending backgrounds – called Generative Fill in Photoshop – and Premiere Pro will soon have a similar feature for video. Rather than extending the frame’s size, Generative Extend will let you add frames to a video to help you, for example, pause on your character’s face for a little longer.
While Adobe hasn’t given these tools a firm release date, only revealing that they’re coming “later this year”, it certainly looks like they’ll change Premiere Pro workflows in several major ways. But the bigger AI video change could be yet to come…
Will Adobe really plug into OpenAI’s Sora?
The biggest Premiere Pro announcement, and also the most nebulous one, was Adobe’s preview of third-party models for the editing app. In short, Adobe is planning to let you plug generative AI video tools including OpenAI‘s Sora, Runway and Pika Labs into Premiere Pro to sprinkle your videos with their effects.
In theory, that sounds great. Adobe showed an example of OpenAI’s Sora generating b-roll with a text-to-video prompt, and Pika powering Generative Extend. But these “early examples” of Adobe’s “research exploration” with its “friends” from the likes of OpenAI are still clouded in uncertainty.
Firstly, Adobe hasn’t committed to launching the third-party plug-ins in the same way as its own Firefly-powered tools. That shows it’s really only testing the waters with this part of the Premiere Pro preview. Also, the integration sits a little uneasily with Adobe’s current stance on generative AI tools.
Adobe has sought to set itself apart from the likes of Midjourney and Stable Diffusion by highlighting that Adobe Firefly is trained only on its Adobe Stock image library, which is apparently free of commercial, branded and trademarked imagery. “We’re using hundreds of millions of assets, all trained and moderated to have no IP,” Adobe’s VP of Generative AI, Alexandru Costin, told us earlier this year.
Yet a new report from Bloomberg claims that Firefly was partially trained on images generated by Midjourney (with Adobe suggesting that could account for 5% of Firefly’s training data). And these previews of new alliances with generative video AI models, which are similarly opaque when it comes to their training data, again sit uneasily with Adobe’s stance.
Adobe’s potential get-out here is Content Credentials, a kind of nutrition label that’s also coming to Premiere Pro and will add watermarks to clarify when AI was used in a video and with which model. Whether or not this is enough for Adobe to balance making a commercially-friendly pro video editor with keeping up in the AI race remains to be seen.
YouTube is doubling down on its efforts to curb the use of ad blockers, expanding its crackdown to include third-party applications on mobile devices. In a recent update, the platform warned users that accessing videos via these ad-blocking apps might result in performance issues or error messages, specifically stating, “The following content is not available on this app.”
YouTube’s Previous Efforts to Combat Ad Blockers
This move comes as no surprise, given YouTube’s previous global initiatives to encourage viewers to either disable ad blockers or transition to its ad-free subscription service, YouTube Premium. The platform had already started disabling video playback for users detected with active ad blocking extensions.
The updated policy underscores YouTube’s commitment to ensuring content creators receive due compensation for their work. YouTube explicitly stated that its guidelines prohibit third-party apps from bypassing ads. This policy aims to safeguard the revenue streams for creators by preventing ad-free viewing, especially on mobile ad blockers like AdGuard. These apps allow users to watch YouTube content seamlessly without encountering ads by accessing the platform within the ad-blocking application.
YouTube also emphasised its strict adherence to the Terms of Service governing the use of its API by third-party apps. Any violations of these terms would lead to decisive actions being taken to protect the platform, its creators, and the audience.
In light of these developments, YouTube is once again advocating for its ad-free subscription service, YouTube Premium, as a viable alternative. While this might be disappointing for users who have grown accustomed to ad-free viewing via third-party apps, YouTube seems resolute in its stance against ad blockers. The platform’s unwavering commitment to this cause suggests that it will continue to evolve its strategies to combat ad blocking, ensuring a fair ecosystem for both creators and viewers.
As YouTube intensifies its efforts to enforce these policies, it remains to be seen how users will respond to these changes. While some may opt for YouTube Premium to enjoy an uninterrupted viewing experience, others might seek alternative ways to access their favourite content without ads. Regardless, YouTube’s latest move signifies its determination to preserve the integrity of its advertising model and support its vast community of creators.
Showing how you really feel about your latest Microsoft Teams calls is set to finally be a lot more inclusive thanks to a new update.
The video conferencing service has revealed it is working on a change that will give users the option to select their preferred skin tone for Microsoft Teams reactions.
Although only a minor tweak to the platform itself, the change is an important signal from Microsoft Teams as it looks to offer a more inclusive environment for workers everywhere.
Microsoft Teams reactions
First introduced in January 2021, Microsoft Teams reactions give users a quick and easy way to show their approval (or not) to chat messages and on calls, covering a range of emojis and symbols such as a thumbs up, frowning face or even a love heart.
Previously, only a single skin tone option has been available, however Microsoft’s new update will now change all that.
In an entry on the official Microsoft 365 roadmap, the company notes, “This preference will be applied to all emojis and reactions in chats, channels, and desktop/web meetings, allowing users to express themselves more authentically in conversations.”
The feature is currently listed as being “in development”, with a scheduled rollout start date of June 2024. It is set to be available for users across the world, on all platforms, including Windows, Mac, Android and iOS.
The update is the latest in a series of changes recently released for Microsoft Teams allowing users to show a little more personality in their calls and chats.
This includes a “unified fun picker”, which brings together stickers, GIFs and more into a single location, meaning you will no longer need to search around for the best reaction or emoji to brighten up a Microsoft Teams call or meeting.
Microsoft Teams has also launched virtual avatars, offering users more customization options while also livening up the look and feel of calls, alongside new visual effects including animated frames and video hue altering, on top of existing tools that allow users to blur backgrounds and soften the video feed to mask and obscure blemishes.
Something seriously weird is happening in the wilderness of Wyoming – and if you thought things were pretty odd in season one of Outer Range, then it looks like Outer Range season two will be stranger still. The Josh Brolin-starring sci-fi western show has got a brand new and quite detailed trailer, which you can see below.
The first season ended pretty explosively, and there’s not long to wait until the second season’s available to stream: it’s coming on May 16, 2024. And hopefully it’ll answer some of the many questions left hanging at the end of season one.
If you haven’t seen Outer Range yet, it’s one of the more ambitious shows you can stream right now. It’s a little bit western, a lot sci-fi, a murder mystery and a family drama too. It starts with the discovery of a massive hole in the ground, a hole from which all kinds of bizarre things emerge. That may not give much away, but it’s one of the best Prime Video shows we’ve seen, so we’re excited to see how season two will match up.
Outer Range Season 2: what to expect
There will be seven episodes in season two, and they’ll all be available from May 16. Josh Brolin’s back, of course, and so are all the key actors from the season one cast – including some who you might not expect to see this time around.
Without giving away any spoilers, the first season of Outer Range eventually kinda sorta answered the question of what the hell was going on out there, and how Brolin’s character Royal Abbott and Imogen Poots’ Autumn ended up where they are. In season two, we’re likely to see more of that backstory, and we’re likely to discover more about the mystery around the once-missing Rebecca.
The second season may feel slightly different from the first because there’s a new showrunner this time around: Brian Watkins, who wrote season one and was the showrunner too, has been replaced by Charles Murray. And Brolin directs one of the episodes in season two.
I’m excited about this, because it’s fun to have a show where you really have no idea what’s going to happen next: while it sometimes feels like it’s going to crash under the weight of its multiple story strands, it hasn’t done that so far. As Wonderfully Weird and Horrifying says, “Outer Range proves to be unique, alluring, and infinitely odd… expertly acted, and wonderfully strange, it’s a hit.”
Star Wars Outlaws will feature a Jabba the Hutt mission, but it’s locked behind the game’s season pass.
Last week (April 9), Ubisoft announced that Massive Entertainment’s open-world Star Wars game will be launching in August and also released a brand-new story trailer that revealed the details about pre-orders and its season pass.
While Star Wars Outlaws will be a single-player experience, players can expect two additional narrative expansions to arrive post-launch, one of which is called Jabba’s Gambit – “a Day 1 exclusive mission” featuring the iconic crime lord.
“Play the exclusive Jabba’s Gambit mission at launch,” the details for the season pass reads. “Just as Kay is putting together a crew for the Canto Bight heist, she receives a job from Jabba the Hutt himself. Turns out that ND-5 owes Jabba a debt from years ago, and he has come to collect.”
However, it’s now been confirmed that Jabba’s Gambit won’t be playable in the $70 / £56 base game; instead, players will need the season pass to access it.
This means players will be required to spend an extra $40 / £32 alongside the standard edition, $110 / £94.99 for the Gold Edition, or $130 / £114 for the Ultimate Edition.
Each edition also comes with its respective pre-order and bonus goodies as well. Still, the high cost of each version has players expressing frustration online, with some calling out Ubisoft for putting additional content behind a paywall.
While neither Ubisoft nor Massive Entertainment has responded to the criticism directly, the publisher has provided a statement clarifying the purchase options.
“To clarify, Jabba the Hutt and the Hutt Cartel are one of the main syndicates in Star Wars Outlaws and will be part of the experience for everyone who purchases the game, regardless of edition,” a Ubisoft representative told PC Gamer.
“The ‘Jabba’s Gambit’ mission is an optional, additional mission with the Hutt Cartel along Kay and Nix’s journey across the Outer Rim. This mission will be available to those who purchase the season pass or an edition of the game which includes the season pass.”
Sony is reportedly asking developers to get their games ready for PS5 Pro optimization, including existing titles and ones still in development.
A new report from The Verge suggests that the PS5 Pro will place a greater emphasis on powerful graphical features like ray tracing, thus directly addressing one of the base PlayStation 5’s biggest weaknesses: its inability (for the most part) to run ray traced games at a stable framerate.
Reportedly, Sony is preparing a “Trinity Enhanced” label (named after the PS5 Pro’s codename) to indicate that a game has been optimized for PS5 Pro. This will apply to both upcoming games and updates for titles that already exist. Enhancements could come in the form of fully-realized ray tracing, as well as improvements to resolution and overall performance.
This push to get games ready for the mid-generation refresh seems to line up with the previously leaked PS5 Pro specs that suggest the upgraded console is allegedly “about 45% faster than standard PlayStation 5” and that Trinity Enhanced titles “can support higher resolution and frame rate.”
That Trinity Enhanced label doesn’t sound at all dissimilar to the way many Xbox games feature Series X and Series S-enhanced patches. However, this is typically given to either games from prior console generations or games that are also available on Xbox One.
Some PS4 games have received current-gen patches already, but this Trinity Enhanced label seems like it’s going to apply to games native to PS5. If so, it’s pretty exciting news for those planning on upgrading to PS5 Pro if and when it eventually releases. We’re already seeing some of the best PS5 games like Dragon’s Dogma 2 and Final Fantasy 16 struggle to maintain consistently high framerates on the base console. So any enhancements here will be very welcome indeed.