Tech News

Live captioning via AI is now available for Zoom, if a little limited

(TECH NEWS) In order to be more inclusive and to improve information sharing with your team, live captioning is a great option for your next Zoom call.

Zoom live captioning

The ubiquitous all-father Zoom continues to prompt innovation–and at a time when most companies are still using some form of remote communication, who can blame them? It’s only fitting that someone would come along and try to flesh out Zoom’s accessibility features at some point, which is exactly what Zoom Live Captioning sets out to accomplish.

Zoom Live Captioning is a Zoom add-on service that promises, for a flat fee, to caption up to 80 hours per month of users’ meetings via an easy-to-implement plugin. The allure is clear: a virtual communication environment that is more time-efficient, more accessible, and more flexible for a variety of usage contexts.

Unfortunately, what’s less clear is how Zoom Live Captioning proposes to achieve this goal.

The live-captioning service boasts, among other things, “limited lag” and “the most accurate [speech-to-text AI] in the world”–a service that, despite its sensational description, is still only available in English. Furthermore, anyone who has experienced auto-captioning on YouTube videos–courtesy of Google, one of the largest technology companies in the world–knows that, even with crystal-clear audio, caption accuracy is questionable at best.

Try applying that level of moving-target captioning to your last Zoom call, and you’ll see what the overarching problem here is.

Even if your Zoom call has virtually no latency, everyone speaks clearly and enunciates perfectly, everyone on your team speaks proficient conversational English, and no one ever interrupts or experiences microphone feedback, it seems reasonable to expect that captions would still be finicky. Especially if you’re deaf or hard of hearing–a selling point Zoom Live Captioning drives home–this is a problematic flaw in a good idea.

Now, it’s completely fair to postulate that any subtitles are better than no subtitles at all. If that’s the decision you’d like to make for your team, Zoom Live Captioning starts at $20 per person per month; larger teams are encouraged to contact the company to discuss more reasonable rates if they want to incorporate live captioning across an enterprise.

Nothing would be better for speech-to-text innovation than being wrong about Zoom Live Captioning’s potential for inaccuracy, but for now, it’s safe to be a little skeptical.

Jack Lloyd has a BA in Creative Writing from Forest Grove's Pacific University; he spends his writing days using his degree to pursue semicolons, freelance writing and editing, Oxford commas, and enough coffee to kill a bear. His infatuation with rain is matched only by his dry sense of humor.

Tech News

This app connects music fans with their favorite bands

(TECH NEWS) With the Band, a Nashville-based company, is using tech to reshape virtual concerts and fandom experiences for music fans during COVID-19.

Music concert crowd no longer safe but can be experienced virtually.

Nothing beats the experience of seeing your favorite artist live – except maybe that moment when you look next to you to see that others are feeling the music just as much as you are. Musical communities create a truly special bond, one that isn’t location-specific. Perhaps that’s why fan engagement platforms, such as Patreon and Memberful, are so successful in cultivating online fanbases.

One app making headway in the fandom world during the COVID-19, concert-less era is Nashville-based With the Band. The fan engagement platform, which connects artists with fans and fans with each other, has found itself in a pivotal position – how can it expand engagement to meet the growing needs of quarantine?

Before COVID, the app was used primarily to empower music fans and artists to create and participate in fan projects and meet ups. Perhaps the most notable example of a With the Band moment came in September 2019, when fans organized the distribution of 16,000 signs at a Jonas Brothers concert in Nashville.

Since COVID-19, however, the platform has had to adjust to a live concert-less world. How are they doing? Pretty well, in my opinion.

With the Band has a new (and exciting!) feature called Fan Crews, which is a modern-day, virtual version of a fan club that even Dr. Fauci could get behind.

With Fan Crews, artists will be able to engage with their fan bases (and monetize their brand) through:

  • Posting
  • Private messaging
  • Virtual meet & greets
  • Live streams (the modern-day concert?)
  • Exclusive content
  • Special giveaways
  • & much more

The most helpful feature of Fan Crews is that artists and their teams will have access to an analytics dashboard, where they can see data pertaining to their fan base – all at zero start-up cost to the artist!

Founder and CEO Sarah Beth Perry – a boyband fangirl – began the With the Band venture from her dorm room in 2017. Now, just three short years and a global pandemic later, With the Band has grown in size and scale, and just might be the best thing to happen to fandom since everything went virtual.

Coronavirus has threatened the music industry from all angles – live concerts must abide by CDC guidelines, which means decreased profit for everyone. Fan meet ups and events have had to go mostly digital, putting the onus on tech features that allow for online fan engagement. Artists are losing money during this time, and fans are not able to engage with the artists and each other in the capacity they crave.

If the COVID-induced crumbling of the live concert industry is a call, With the Band’s Fan Crews is one hell of a response. I’m excited to see what artists and fans do with their new, fully integrated platform.

Tech News

What is “Among Us”? The meme sensation two years in the making

(TECH NEWS) When a game has invaded even the most focused of social media feeds, we have to figure out what it’s all about. Enter Among Us.

Among Us game cover, the latest game meme sensation.

If you’ve been seeing bean-shaped characters pop up in memes, on Twitch, or even on Facebook saying words like “Impostor” or “Red is sus”, you’re not alone.

Among Us, an online multiplayer social deduction game, has taken the online world by storm as of late. Originally released back in 2018, the game gained a massive surge in popularity during the COVID-19 lockdown. According to Sensor Tower’s data, the game passed 100 million downloads on the iOS App Store and Google Play in Q3 of 2020 alone. While the game is free to play on mobile, users can also play on PC for a small fee of $4.99. As it stands, Among Us is currently the third-most played game on Steam, with a solid chance it breaks into the top spot in the next few months.

Haven’t played the game? Well, let’s cover the basics so you understand the endless number of memes coming your way.

The game is played with 4 to 10 people, all of whom are placed together on a single map. Depending on the game settings, 1 to 3 of these people will be randomly assigned as Impostors, whose goal is to kill a certain number of non-Impostors without getting voted off of the map. The rest of the users will be designated as Crewmates, who can win the game by either completing a set number of assigned tasks in the form of minigames or by voting the Impostors off of the map. Impostors gain the advantage of being able to use portions of the map (like vents) that Crewmates cannot, as well as being assigned fake tasks so it can appear that they are a Crewmate. Impostors can also sabotage areas of the map that will require Crewmates to complete an additional task within an allotted time, with failure to do so resulting in an Impostor team win.
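
If it helps to see those rules as code, here is a rough sketch of the lobby setup in Python – purely illustrative, with made-up function and player names, and not the game’s actual implementation:

```python
import random

# Illustrative sketch of the lobby setup described above; this is not the
# game's actual code, and the function and player names are invented.
# A lobby holds 4 to 10 players, and 1 to 3 of them are randomly assigned
# as Impostors. Everyone else becomes a Crewmate.
def assign_roles(players, num_impostors):
    if not 4 <= len(players) <= 10:
        raise ValueError("A lobby holds 4 to 10 players")
    if not 1 <= num_impostors <= 3:
        raise ValueError("Games run with 1 to 3 Impostors")
    impostors = set(random.sample(players, num_impostors))
    return {p: ("Impostor" if p in impostors else "Crewmate") for p in players}

# Example: a five-player lobby with a single Impostor.
print(assign_roles(["Red", "Blue", "Green", "Yellow", "Cyan"], 1))
```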

Impostors will be able to move across the map and kill other players they are next to, turning those players into Ghosts who will still need to complete their tasks for the Crewmates to win. When a player finds a dead body, they can report it, which essentially allows for a time-based discussion and the option to vote for someone to be kicked off of the map. Each player can also use one “emergency meeting”, which can call for a discussion and vote at any time. Since players are allotted a cone of vision that allows them to only see other players within a certain distance, the game relies a lot on convincing other users you are not an Impostor.

Among Us was inspired by the party game Mafia, proving that a few adjustments to a classic concept can pay dividends. Due to the mostly chat-based dialogue, memes have popped up of Crewmates accusing people of being suspicious by saying they are “sus” based on their actions. There has also been a rise in memes highlighting a group of people saying someone must be an Impostor and voting them off, only to view the “X was not the Impostor” dialogue from the game.

Hopefully, this helps you understand some of the bean-shaped images you’ve been seeing recently. With the game rising rapidly on streaming platforms over the summer, it’s unlikely the wave of memes and references to the game will end anytime soon. If you still don’t understand it, then I recommend you take the plunge and play the game—after all, it’s free on mobile.

Tech News

Snapchat is among the first to leverage Apple’s new powerful AR tools

(TECH NEWS) Apple has announced the iPhone 12 Pro’s LiDAR scanner that will take AR to a whole new level, and Snapchat is already leveraging the technology in its Lens Studio 3.2.

Phone taking picture of food shows potential of AR

Augmented Reality (AR) uses computer-generated information to create an enhanced and interactive experience of the world. It intertwines the physical world with the digital one to make it more entertaining and fun. And, this week Apple unveiled its latest phone models, the iPhone 12 Pro and iPhone 12 Pro Max, and along with them, its custom-designed LiDAR scanner.

LiDAR stands for Light Detection and Ranging, and it measures how long it takes light to reach an object and reflect back. With the sensor, the new iPhone’s machine learning capabilities, and the iOS 14 framework, the iPhone can “understand the world around you.” “LiDAR makes iPhone 12 Pro a powerful device for delivering instant AR and unlocking endless opportunities in apps,” said iPhone Product Line Manager, Francesca Sweet.
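
For the curious, that measurement is the classic time-of-flight calculation: the distance to an object is the speed of light multiplied by the round-trip time, divided by two. Here is a minimal sketch of that math – illustrative only, not Apple’s implementation:

```python
# Toy illustration of the time-of-flight principle behind LiDAR:
# distance = speed of light * round-trip time / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_meters(round_trip_seconds):
    """Distance to an object, given how long a light pulse takes to go out and bounce back."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after about 20 nanoseconds puts the object roughly 3 meters away.
print(distance_meters(20e-9))  # ~3.0
```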

Apple says their new technology will help enable object and room scanning, photo and video effects, and precise placement of AR objects. With LiDAR’s ability to “see in the dark”, the sensor can autofocus in low light six times faster. In doing so, it improves focus accuracy and reduces capture time “so your subject is clearly in focus without missing the moment.”

And, Snapchat is making sure it isn’t missing the moment either. The company is among the first to leverage iPhone 12 Pro’s LiDAR scanner for AR on its iOS app. On Wednesday, Snapchat announced it is launching Lens Studio 3.2, which will allow creators and developers to build their LiDAR-powered lenses for the iPhone 12 Pro.

“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” said Eitan Pilipski, Snap’s SVP of Camera Platform. “We’re excited to collaborate with Apple to bring this sophisticated technology to our Lens Creator community.”

According to a Lens Studio article, the new iPhone 12 Pro AR experience will have a better understanding of geometry and the meaning of surfaces and objects. It will let Snapchat’s camera “see a metric scale of the scene”, which will allow “Lenses to interact realistically with the surrounding world.”

Even though the iPhone 12 Pro isn’t here yet, this isn’t stopping Snapchat from letting creators and developers start bringing their “LiDAR-powered Lenses to life.” Its new and interactive preview mode in Lens Studio 3.2 will already allow them to do that. So, if you’d like to get started, you can download the template on their site.

According to Apple, the new iPhone 12 Pro’s LiDAR scanner “puts advanced depth-mapping technology in your pocket.” Overall, Apple’s new technology has fancy sensors that will allow you to take top-quality photos and videos in low light. It will also allow you to create an AR experience that should be better than what exists now. During Apple’s announcement, they said all these new “incredible pro technologies” won’t come with a higher price tag. I guess it’s up to you whether you really need the fancy new iPhone 12 Pro to play with the new lenses in Snapchat.
