Golden unveils a Wikipedia alternative focused on emerging tech and startups

Jude Gomila, who previously sold his mobile advertising company Heyzap to RNTS Media, is taking on a new challenge — building a “knowledge base” that can fill in Wikipedia’s blind spots, particularly when it comes to emerging technologies and startups.

While Gomila is officially launching Golden today, it’s already full of content about things like the latest batch of Y Combinator startups and morphogenetic engineering. And it’s already raised $5 million from Andreessen Horowitz, Gigafund, Founders Fund, SV Angel, Liquid 2 Ventures/Joe Montana, plus a long list of individual angel investors including Gomila’s Heyzap co-founder Immad Akhund.

To state the obvious: Wikipedia is an incredibly useful website, but Gomila pointed out that notable companies and technologies like SV Angel, Benchling, Lisk and Urbit don’t currently have entries. Part of the problem is what he called Wikipedia’s “arbitrary notability threshold,” where pages are deleted for not being notable enough. (Full disclosure: This is also what happened years ago to the Wikipedia page about yours truly — which I swear I didn’t write myself.)

Perhaps that threshold made sense when Wikipedia was just getting started and the infrastructure costs were higher, but Gomila said it doesn’t make sense now. In determining what should be included in Golden, he said the “more fundamental” question is more about existence: “Does this company exist? Does Anthony Ha exist?” If so, there’s a good chance that it should have a page on Golden, at least eventually.

In his blog post outlining his vision for the site, Gomila wrote:

We live in an age of extreme niches, an age when validation and completeness is more important than notability. Our encyclopedia on Golden doesn’t have limited shelf space — we eventually want to map everything that exists. Special relativity was not notable to the general public the moment Einstein released his seminal paper, but certainly was later on — could this have been the kind of topic to be removed from the world’s canon if it was discovered today?

Golden homepage

Gomila said he’s also bringing some new technologies and fresh approaches to the problem. Some of this is pretty straightforward, like allowing users to embed video, academic papers and other multimedia content onto Golden pages.

At the same time, he’s hoping to make it much easier to write and edit Golden pages. You do so in a WYSIWYG editor that doesn’t require you to know any HTML, and the site will help you with automated suggestions, for example pulling out author and title information when you’re adding a link to another site.

Gomila said that this will allow users to work much more quickly, so that “one hour spent on Golden is effectively 100 hours on other platforms.”

There’s also an emphasis on transparency, which includes features like “high resolution citations” (citations that make it extra clear which statement you’re trying to provide evidence for) and the fact that Golden account names are tied to your real identity — in other words, you’re supposed to edit pages under your own name. Gomila said the site backs this up with bot detection and “various protection mechanisms” designed to ensure that users aren’t pretending to be someone they’re not.

“I’m sure there will always be trolls up to their usual tricks, but they will be on the losing side,” he told me.

AI Suggestions

If you think someone has added incorrect or misleading information to a page, you can flag it as an issue. Gomila suggested AI could also play a more editorial role by pointing out when someone is using language that’s biased or seems too close to marketing-speak.

“AI can have bias and humans can have bias,” he acknowledged, but he’s hoping that both elements working together can help Golden get closer to the truth. He added that “rather than us editorially changing things, our team will act like normal users” who can edit and flag issues.

Golden is available to users for free, without advertising. Gomila said his initial plan for making money is charging investment funds and large companies for a more sophisticated query tool.

Source: TechCrunch

Oculus announces a VR subscription service for enterprises

Oculus is getting serious about monetizing VR for enterprise.

The company has previously sold business-specific versions of its headsets, but now it’s adding a pricey annual device management subscription.

Oculus Go for Business starts at $599 (64 GB) and the enterprise Oculus Quest starts at $999 (128 GB). These prices include the first year of enterprise device management and support, which costs $180 per device per year thereafter.

Here’s what that fee gets you:

This includes a dedicated software suite offering device setup and management tools, enterprise-grade service and support, and a new user experience customized for business use cases.

The new Oculus for Business launches in the fall.

Source: TechCrunch

Developers can now verify mobile app users over WhatsApp instead of SMS

Facebook today released a new SDK that allows mobile app developers to integrate WhatsApp verification into Account Kit for iOS and Android. This will allow developers to build apps where users can opt to receive their verification codes through the WhatsApp app installed on their phone, instead of through SMS.

Today, many apps give users the ability to sign up using only a phone number — a now popular alternative to Facebook Login, thanks to the social network’s numerous privacy scandals which led to fewer people choosing to use Facebook with third-party apps.

Plus, using phone numbers to sign up is common with a younger generation of users who don’t have Facebook accounts — and sometimes barely use email, except for joining apps and services.

When using a phone number to sign in, it’s common for the app to confirm the user by sending a verification code over SMS to the number provided. The user then enters that code to create their account. This process can also be used when logging in, as part of a multi-factor verification system where a user’s account information is combined with this extra step for added security.

While this process is straightforward and easy enough to follow, SMS is not everyone’s preferred messaging platform. That’s particularly true in emerging markets like India, where 200 million people are on WhatsApp, for example. In addition, those without an unlimited messaging plan are careful not to overuse texting when it can be avoided.

That’s where the WhatsApp SDK comes in. Once integrated into an iOS or Android app, developers can offer to send users their verification code over WhatsApp instead of text messaging. They can even choose to disable SMS verification, notes Facebook.

This is all part of Facebook’s Account Kit, a larger set of developer tools designed to let people quickly register and log in to apps or websites using only a phone number or email, no password required.

This WhatsApp verification code option has been available in the web SDK since late 2018, but hadn’t been available for mobile apps until today.

Source: TechCrunch

Google employees are staging a sit-in to protest reported retaliation

Google employees are staging a sit-in tomorrow at 11 a.m. to protest alleged retaliation against employees at the hands of managers.

“From being told to go on sick leave when you’re not sick, to having your reports taken away, we’re sick of retaliation,” Google employees tweeted via @GoogleWalkout. “Six months ago, we walked out. This time, we’re sitting in.”

Google declined to comment on the sit-in but pointed to its previous statement regarding retaliation:

“We prohibit retaliation in the workplace and publicly share our very clear policy,” a Google spokesperson told TechCrunch. “To make sure that no complaint raised goes unheard at Google, we give employees multiple channels to report concerns, including anonymously, and investigate all allegations of retaliation.”

This comes six months after 20,000 Google employees walked out following the company’s mishandling of sexual harassment allegations. Last week, two Google employees accused the company of retaliating against them for organizing the walkout, Wired first reported.

Meredith Whittaker, the lead of Google’s Open Research and one of the organizers of the walkout, said her role was “changed dramatically.” Fellow walkout organizer Claire Stapleton said her manager told her she would be demoted and lose half of her reports.

That was followed by an employee-led town hall meeting to hear from other employees who had faced retaliation at Google. Yesterday, Googlers publicly shared additional stories of retaliation on Medium. Here’s one:

My retaliators were punished with “coaching”

I reported my tech lead to my manager for sexual harassment, but my manager thought I was “overreacting.” I then reported my manager, as I could no longer feel comfortable working with this colleague every day while no action was being taken. The tech lead provided unsolicited feedback in my perf that took four months for the perf team to remove. The manager boxed me out and denied my promotion nomination by my peers. Eventually HR found there was retaliation but simply offered “coaching” to the tech lead and manager. I was asked to accept this. I refused. No additional actions were taken. They both still work at Google.

In response, Google Global Director of Diversity, Equity & Inclusion Melonie Parker began publicly sharing the company’s workplace policies on harassment, discrimination and retaliation. That policy specifically states Google prohibits retaliation for “raising a concern about a violation of policy or law or participating in an investigation relating to a violation of policy or law. Retaliation means taking an adverse action against an employee or TVC as a consequence of reporting, for expressing an intent to report, for assisting another employee in an effort to report, for testifying or assisting in a proceeding involving sexual harassment under any federal, state or local anti-discrimination law, or for participating in the investigation of what they believe in good faith to be a possible violation of our Code of Conduct, Google policy or the law.”

Source: TechCrunch

Hackers went undetected in Citrix’s internal network for six months

Hackers gained access to technology giant Citrix’s networks six months before they were discovered, the company has confirmed.

In a letter to California’s attorney general, the virtualization and security software maker said the hackers had “intermittent access” to its internal network from October 13, 2018 until March 8, 2019, two days after the FBI alerted the company to the breach.

Citrix said the hackers “removed files from our systems, which may have included files containing information about our current and former employees and, in limited cases, information about beneficiaries and/or dependents.”

Initially the company said hackers stole business documents. Now it’s saying the stolen information may have included names, Social Security numbers and financial information.

Citrix said in a later update on April 4 that the attack was likely the result of password spraying, a technique in which attackers try lists of commonly used passwords to brute-force their way into accounts that aren’t protected with two-factor authentication.

We asked Citrix how many staff were sent data-breach notification letters, but a spokesperson did not immediately comment.

Under California law, the authorities must be informed of a breach if more than 500 state residents are involved.

Source: TechCrunch

Diving into TED2019, the state of social media, and internet behavior

Extra Crunch offers members the opportunity to tune into conference calls led and moderated by the TechCrunch writers you read every day. Last week, TechCrunch’s Anthony Ha gave us his recap of the TED2019 conference and offered key takeaways from the most interesting talks and provocative ideas shared at the event.

Under the theme “Bigger Than Us,” the conference featured talks, Q&As and presentations from a wide array of high-profile speakers, including an appearance from Twitter CEO Jack Dorsey that was the talk of the week. Anthony dives deeper into the questions raised by Dorsey’s onstage interview that kept popping up: How has social media warped our democracy? How can the big online platforms fight back against abuse and misinformation? And what is the Internet good for, anyway?

“…So I would suggest that probably five years ago, the way that we wrote about a lot of these tech companies was too positive and they weren’t as good as we made them sound. Now the pendulum has swung all the way in the other direction, where they’re probably not as bad as we make them sound…

…At TED, you’d see the more traditional TED talks about, “Let’s talk about the magic of finding community in the internet.” There were several versions of that talk this year. Some of them very good, but now you have to have that conversation with the acknowledgement that there’s much that is terrible on the internet.”

Anthony also digs into what really differentiates the TED conference from other tech events, what types of people did and should attend the event, and even how he managed to get kicked out of the theater for typing too loud.

For access to the full transcription and the call audio, and for the opportunity to participate in future conference calls, become a member of Extra Crunch. Learn more and try it for free. 

Source: TechCrunch

Why did last night’s ‘Game of Thrones’ look so bad? Here comes the science!

Last night’s episode of “Game of Thrones” was a wild ride and inarguably one of an epic show’s more epic moments — if you could see it through the dark and the blotchy video. It turns out even one of the most expensive and meticulously produced shows in history can fall prey to the scourge of low quality streaming and bad TV settings.

The good news is this episode is going to look amazing on Blu-ray or potentially in future, better streams and downloads. The bad news is that millions of people already had to see it in a way its creators surely lament. You deserve to know why this was the case. I’ll be simplifying a bit here because this topic is immensely complex, but here’s what you should know.

(By the way, I can’t entirely avoid spoilers, but I’ll try to stay away from anything significant in words or images.)

It was clear from the opening shots in last night’s episode, “The Long Night,” that this was going to be a dark one. The army of the dead faces off against the allied living forces in the darkness, made darker by a bespoke storm brought in by, shall we say, a Mr. N.K., to further demoralize the good guys.

If you squint you can just make out the largest army ever assembled

Thematically and cinematographically, setting this chaotic, sprawling battle at night is a powerful creative choice and a valid one, and I don’t question the showrunners, director, and so on for it. But technically speaking, setting this battle at night, and in fog, is just about the absolute worst case scenario for the medium this show is native to: streaming home video. Here’s why.

Compression factor

Video has to be compressed in order to be sent efficiently over the internet, and although we’ve made enormous strides in video compression and the bandwidth available to most homes, there are still fundamental limits.

The master video that HBO put together from the actual footage, FX, and color work that goes into making a piece of modern media would be huge: hundreds of gigabytes if not terabytes. That’s because the master has to include all the information on every pixel in every frame, no exceptions.

Imagine if you tried to “stream” a terabyte-sized video file. You’d have to be able to download 200 megabytes per second for the full 80 minutes of this episode. Few people in the world have that kind of connection — it would basically never stop buffering. Even 20 megabytes per second is asking too much by a long shot. 2 is doable — slightly under the 25 megabit speed (that’s bits… divide by 8 to get bytes) we use to define broadband download speeds.
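The arithmetic above can be checked in a few lines of Python. The 1 TB master size and 80-minute runtime are the article’s round figures, not HBO’s actual numbers:

```python
def required_mb_per_s(size_bytes: float, duration_s: float) -> float:
    """Sustained megabytes per second needed to stream a file in real time."""
    return size_bytes / duration_s / 1_000_000

one_tb = 1_000_000_000_000  # 1 terabyte in bytes
episode = 80 * 60           # 80-minute episode in seconds

master_rate = required_mb_per_s(one_tb, episode)
print(f"Uncompressed master: {master_rate:.0f} MB/s")   # ~208 MB/s

# The 25 megabit/s broadband benchmark, converted to megabytes (divide by 8)
broadband_mb = 25 / 8
print(f"Broadband budget:    {broadband_mb:.3f} MB/s")  # ~3.1 MB/s
print(f"Compression needed:  ~{master_rate / broadband_mb:.0f}x")
```

On those round numbers, the codec has to shrink the master by roughly a factor of 67 just to fit the nominal broadband definition, before leaving any headroom for the rest of the household.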

So how do you turn a large file into a small one? Compression — we’ve been doing it for a long time, and video, though different from other types of data in some ways, is still just a bunch of zeroes and ones. In fact it’s especially susceptible to strong compression because of how one video frame is usually very similar to the last and the next one. There are all kinds of shortcuts you can take that reduce the file size immensely without noticeably impacting the quality of the video. These compression and decompression techniques fit into a system called a “codec.”
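As a toy illustration of why frame-to-frame similarity matters, here is a crude delta encoder in Python. It is a stand-in for real inter-frame prediction, not what any production codec actually does:

```python
def delta_encode(prev_frame, frame):
    """Record only (index, new_value) pairs for pixels that changed."""
    return [(i, v) for i, (p, v) in enumerate(zip(prev_frame, frame)) if p != v]

def delta_decode(prev_frame, deltas):
    """Rebuild a frame by applying the recorded changes to the previous one."""
    frame = list(prev_frame)
    for i, v in deltas:
        frame[i] = v
    return frame

# A 10,000-pixel frame where only 50 pixels change between frames
frame_a = [128] * 10_000
frame_b = list(frame_a)
for i in range(50):
    frame_b[i] = 255

deltas = delta_encode(frame_a, frame_b)
print(len(deltas))                               # 50 entries instead of 10,000 pixels
assert delta_decode(frame_a, deltas) == frame_b  # lossless round trip
```

Storing 50 changed pixels instead of 10,000 is the intuition behind inter-frame compression; real codecs add motion estimation, transforms and lossy quantization on top.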

But there are exceptions to that, and one of them has to do with how compression handles color and brightness. Basically, when the image is very dark, it can’t display color very well.

The color of winter

Think about it like this: There are only so many ways to describe colors in a few words. If you have one word you can say red, or maybe ochre or vermilion depending on your interlocutor’s vocabulary. But if you have two words you can say dark red, darker red, reddish black, and so on. The codec has a limited vocabulary as well, though its “words” are the numbers of bits it can use to describe a pixel.

This lets it succinctly describe a huge array of colors with very little data by saying: this pixel has this bit value of color, this much brightness, and so on. (I didn’t originally want to get into this, but this is what people are talking about when they say bit depth, or even “highest quality pixels.”)

But this also means that there are only so many gradations of color and brightness it can show. Going from a very dark grey to a slightly lighter grey, it might be able to pick 5 intermediate shades. That’s perfectly fine if it’s just on the hem of a dress in the corner of the image. But what if the whole image is limited to that small selection of shades?

Then you get what we saw last night. See how Jon (I think) is made up almost entirely of a handful of different colors (brightnesses of a similar color, really), with big, obvious borders between them?

This issue is called “banding,” and it’s hard not to notice once you see how it works. Images on video can be incredibly detailed, but places where there are subtle changes in color — often a clear sky or some other large but mild gradient — will exhibit large stripes as the codec goes from “darkest dark blue” to “darker dark blue” to “dark blue,” with no “darker darker dark blue” in between.
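You can reproduce the effect numerically: quantizing a smooth 0-255 gradient down to a handful of evenly spaced levels (a deliberately exaggerated stand-in for a codec starved of dark shades) leaves only a few visible bands:

```python
def quantize(values, levels):
    """Snap each 8-bit value to the nearest of `levels` evenly spaced shades."""
    step = 255 / (levels - 1)
    return [round(round(v / step) * step) for v in values]

gradient = list(range(256))            # a smooth dark-to-light ramp
banded = quantize(gradient, levels=6)  # only six shades survive

print(sorted(set(banded)))  # [0, 51, 102, 153, 204, 255]
```

Every pixel in the ramp gets forced onto one of six values, so instead of a smooth gradient you see six flat stripes with hard edges between them. Real codecs have far more levels overall, but a very dark, low-contrast scene is confined to the bottom few of them, which is the same situation in miniature.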

Check out this image.

Above is a smooth gradient encoded with high color depth. Below that is the same gradient encoded with lossy JPEG encoding — different from what HBO used, obviously, but you get the idea.

Banding has plagued streaming video forever, and it’s hard to avoid even in major productions — it’s just a side effect of representing color digitally. It’s especially distracting because obviously our eyes don’t have that limitation. A high-definition screen may actually show more detail than your eyes can discern from couch distance, but color issues? Our visual systems flag them like crazy. You can minimize it, but it’s always going to be there, until the point when we have as many shades of grey as we have pixels on the screen.

So back to last night’s episode. Practically the entire show took place at night, which removes about 3/4 of the codec’s brightness-color combos right there. It also wasn’t a particularly colorful episode, a directorial or photographic choice that highlighted things like flames and blood, but further limited the ability to digitally represent what was on screen.

It wouldn’t be too bad if the background was black and people were lit well so they popped out, though. The last straw was the introduction of the cloud, fog, or blizzard, whatever you want to call it. This kept the brightness of the background just high enough that the codec had to represent it with one of its handful of dark greys, and the subtle movements of fog and smoke came out as blotchy messes (often called “compression artifacts” as well) as the compression desperately tried to pick what shade was best for a group of pixels.

Just brightening it doesn’t fix things, either — because the detail is already crushed into a narrow range of values, you just get a bandy image that never gets completely black, making it look washed out, as you see here:

(Anyway, the darkness is a stylistic choice. You may not agree with it, but that’s how it’s supposed to look and messing with it beyond making the darkest details visible could be counterproductive.)

Now, it should be said that compression doesn’t have to be this bad. For one thing, the more data it is allowed to use, the more gradations it can describe, and the less severe the banding. It’s also possible (though I’m not sure where it’s actually done) to repurpose the rest of the codec’s “vocabulary” to describe a scene where its other color options are limited. That way the full bandwidth can be used to describe a nearly monochromatic scene even though strictly speaking it should be only using a fraction of it.

But neither of these is likely an option for HBO: Increasing the bandwidth of the stream is costly, since this is being sent out to tens of millions of people — a bitrate increase big enough to change the quality would also massively swell its data costs. Distributing to that many people also introduces the risk of dreaded buffering or playback errors, which are obviously a big no-no. It’s even possible that HBO lowered the bitrate because of network limitations — “Game of Thrones” really is on the frontier of digital distribution.

And using an exotic codec might not be possible because only commonly used commercial ones are really capable of being applied at scale. Kind of like how we try to use standard parts for cars and computers.

This episode almost certainly looked fantastic in the mastering room and FX studios, where they not only had carefully calibrated monitors with which to view it but also were working with brighter footage (it would be darkened to taste by the colorist) and less or no compression. They might not even have seen the “final” version that fans “enjoyed.”

We’ll see the better copy eventually, but in the meantime the choice of darkness, fog, and furious action meant the episode was going to be a muddy, glitchy mess on home TVs.

And while we’re on the topic…

You mean it’s not my TV?

Well… to be honest, it might be that too. What I can tell you is that simply having a “better” TV by specs, such as 4K or a higher refresh rate or whatever, would make almost no difference in this case. Even built-in de-noising and de-banding algorithms would be hard pressed to make sense of “The Long Night.” And one of the best new display technologies, OLED, might even make it look worse! Its “true blacks” are much darker than an LCD’s backlit blacks, so the jump to the darkest grey could be way more jarring.

That said, it’s certainly possible that your TV is also set up poorly. Those of us sensitive to this kind of thing spend forever fiddling with settings and getting everything just right for exactly this kind of situation.

Usually “calibration” is actually a pretty simple process of making sure your TV isn’t on the absolute worst settings, which unfortunately many are out of the box. Here’s a very basic three-point guide to “calibrating” your TV:

  1. Go through the “picture” or “video” menu and turn off anything with a special name, like “TrueMotion,” “Dynamic motion,” “Cinema mode,” or anything like that. Most of these make things look worse, especially anything that “smooths” motion. Turn those off first and never ever turn them on again. Don’t mess with brightness, gamma, color space, anything you have to turn up or down from 50 or whatever.
  2. Figure out lighting by putting on a good, well-shot movie in the situation you usually watch stuff — at night maybe, with the hall light on or whatever. While the movie is playing, click through any color presets your TV has. These are often things like “natural,” “game,” “cinema,” “calibrated,” and so on and take effect right away. Some may make the image look too green, or too dark, or whatever. Play around with it and whichever makes it look best, use that one. You can always switch later – I myself switch between a lighter and darker scheme depending on time of day and content.
  3. Don’t worry about HDR, dynamic lighting, and all that stuff for now. There’s a lot of hype about these technologies and they are still in their infancy. Few will work out of the box and the gains may or may not be worth it. The truth is a well shot movie from the ’60s or ’70s can look just as good today as a “high dynamic range” show shot on the latest 8K digital cinema rig. Just focus on making sure the image isn’t being actively interfered with by your TV and you’ll be fine.

Unfortunately none of these things will make “The Long Night” look any better until HBO releases a new version of it. Those ugly bands and artifacts are baked right in. But if you have to blame anyone, blame the streaming infrastructure that wasn’t prepared for a show taking risks in its presentation, risks I would characterize as bold and well executed, unlike the writing in the show lately. Oops, sorry, couldn’t help myself.

If you really want to experience this show the way it was intended, the fanciest TV in the world wouldn’t have helped last night, though when the Blu-ray comes out you’ll be in for a treat. But here’s hoping the next big battle takes place in broad daylight.

Source: TechCrunch

Interactive content is coming to Walmart’s Vudu & the BBC

Netflix’s early experiments with interactive content may not have always hit the mark. Its flagship effort on this front, Black Mirror: Bandersnatch, was a frustrating experiment — and now, the subject of a lawsuit. But the industry has woken up to the potential of personalized programming. Not only is Netflix pursuing more interactive content, including perhaps a rom-com, others are following suit with interactive offerings of their own, including Amazon, Google — and now, it seems — Walmart and the BBC.

A couple of months ago, Amazon’s audiobook division Audible launched professionally performed audio stories for Alexa devices in order to test whether voice-controlled choose-your-own-adventure style narratives would work on smart speakers, like the Amazon Echo.

YouTube is also developing interactive programming and live specials, including its own choose-your-own-adventure-style shows.

Now, according to a new report from Bloomberg, Walmart is placing its own bet on interactive media — but with an advertising-focused twist. Through its investment in interactive media company Eko, Walmart will debut several new shows for its streaming service Vudu that feature “shoppable” advertisements. That is, instead of just seeing an ad for a product that Walmart carries, customers will be able to buy the products seen in the shows, too.

Bloomberg’s report is light on the details — more is expected at Walmart’s Newfronts announcement this week — but Eko has already developed ads tied to interactive TV where the ad that plays matches the emotion of the viewer/participant, based on their choices within the branching narrative. It also created ads that viewers click their way through, seeing different versions of the ad’s story with each click.

And today, the BBC announced it’s venturing into interactive content for the first time, too.

As part of its NewFronts announcements, the broadcaster unveiled its plans for interactive news programming within its technology news show Click.

For the show’s 1,000th episode, airing later this year, it will introduce a full-length branching-narrative episode, where the experience is personalized and localized to individual viewers. Unlike choose-your-own-adventure-style programs that present only a few options to pick from, this episode will also have viewers answer questions at the beginning of the show to tailor their experience.

Part of the focus will be on presenting different versions of the program based on the viewer’s own technical knowledge, the BBC said.

A team of a dozen coders is currently building the episode, so the broadcaster can’t yet confirm how many different variations will be available in the end, or what topics will be featured on the episode. However, one topic being considered is lab-grown meat, we’re told.

The BBC says it’s very much planning to make interactivity an ongoing effort going forward.

This collective rush to interactive, personalized programming may lead some to believe this is indeed the next big thing in media and entertainment. But the reality is that these shows are costly to produce and difficult to scale compared with traditional programming. Plus, viewer reaction has been mixed so far.

Some may decide further experiments aren’t worth pursuing if they don’t produce a bump in viewership, subscriber numbers, or advertiser click-throughs — depending on which metric they care about.

In the meantime, though, it will be interesting to see these different approaches to interactive content make their debut.

Source: TechCrunch

WeWork files confidentially for IPO

WeWork, the co-working giant now known as The We Company, has submitted confidential documents to the U.S. Securities and Exchange Commission for an initial public offering, the company confirmed in a press release Monday.

According to The New York Times, the business initially filed IPO paperwork in December.

WeWork, valued at $47 billion in January, has raised $8.4 billion in a combination of debt and equity funding since it was founded by Adam Neumann and Miguel McKelvey in 2010. WeWork is among several tech unicorns with hundreds of millions, billions actually, in backing from the SoftBank Vision Fund. Recently, the Japanese telecom giant eyed a majority stake in the company worth $16 billion, but cooled its jets at the last minute.

WeWork doubled its revenue from $886 million in 2017 to roughly $1.8 billion in 2018, with net losses hitting a staggering $1.9 billion. These aren’t attractive metrics for a pre-IPO business; then again, Uber’s currently completing a closely watched IPO roadshow despite shrinking growth. Here’s more from Crunchbase News on WeWork’s top line financials:

  • WeWork’s 2017 revenue: $886 million
  • WeWork’s 2017 net loss: $933 million
  • WeWork’s 2018 revenue: $1.82 billion (+105.4 percent)
  • WeWork’s 2018 net loss: $1.9 billion (+103.6 percent)

On the bright side, per Axios, WeWork established a 90 percent occupancy rate in 2018, with total membership rising 116 percent to 401,000.

WeWork is often referenced as the perfect example of Silicon Valley’s tendency to inflate valuations. WeWork, a real estate business, burns through cash rapidly and will undoubtedly have to work hard to convince public markets investors of its longevity, as well as its status as a tech company.

WeWork is backed by SoftBank, Benchmark, T. Rowe Price, Fidelity, Goldman Sachs and several others.

Source: TechCrunch

Overcast makes it easy to turn podcast moments into viral clips

The popular iOS podcast app Overcast wants to make it easier for people to share clips from their favorite shows across social media. The feature will likely be well-received by podcasters looking to expand their show’s audience, as they’ve previously been limited to sharing their podcasts by way of links or audio-only snippets, for the most part. Overcast’s solution, meanwhile, allows anyone to share either an audio or a video clip from any public podcast, the company said in an announcement.

That means a show’s fans can get in on the action — giving their favorite podcast a viral boost by promoting it on social media, where it could reach new listeners.

To use the clip-sharing feature in Overcast, you first tap on the “share” button at the top-right corner of the app. You can then pick either an audio clip or a portrait, landscape or square video. In the clip-editing interface that appears, you can locate and select the audio clip you want to share. Clips can be up to one minute in length, the company says.

The variety of video formats is designed to appeal to those tasked with marketing a podcast across social media — including Twitter, Instagram, Facebook or Snapchat — where the supported video aspect ratios may vary. In addition, podcast marketers will be able to remove the Overcast branding from their shared clip to give it a more professional feel.

Overcast’s new feature competes with existing tools for marketing audio across social media — like those from Wavve, Headliner, Spotify-owned Anchor and others, including, perhaps, SoundCloud. Some of these services offer captions, as well, which podcasters may prefer to Overcast’s clips.

But unlike other rival tools, Overcast’s clip-sharing feature isn’t meant only for podcast creators and marketers — it’s for listeners, too.

Of course, that also could present a problem. Listeners who have an axe to grind could pull a clip that presents a podcaster in a bad light — perhaps, taking out of context something they said in hopes of manufacturing social media outrage. Or maybe they just catch the podcaster on a bad day saying something dumb. Small gaffes that in the past could have been overlooked could now be used against a podcaster because these viral clips are so easy to create and share.

Time will tell to what extent the feature is adopted and how it’s used, or if the idea makes its way to other apps to become more of a standard.

According to Overcast founder Marco Arment, the clip-sharing feature was inspired by a remark on the Unco podcast by Stephen Hackett, where the problem was discussed in more detail.

In addition to the launch of clips, Overcast’s public sharing page got a small refresh, too. It now features badges linking to other podcast apps, as well as the show’s RSS feed, for any show listed in Apple Podcasts.

“It’s important for me to promote other apps like this, and to make it easy even for other people’s customers to benefit from Overcast’s sharing features, because there are much bigger threats than letting other open-ecosystem podcast apps get a few more users,” Arment said.

That “much bigger threats” comment refers to the new trend of podcast “exclusives” — like those on Luminary or Spotify, which aren’t available to the public. Arguably, these aren’t podcasts in the strictest sense of the word — they’re audio programs.

The clips-sharing feature takes the opposite position. The podcasts this feature helps to promote are open and accessible to the public — and now all of the content inside each episode is more accessible, too.

Source: TechCrunch