Lily raises $2M from NEA and others for a personal stylist service that considers feelings, not just fit

One of the reasons recently IPO’d Stitch Fix became so popular among female shoppers is because of how it pairs the convenience of home try-on for clothing and accessories with a personal styling service that adapts to your tastes over time. But often, personal stylists bring their own subjective takes on fashion to their customers. A new startup called Lily aims to offer a more personalized service that takes into account not just what’s on trend or what looks good, but also how women feel about their bodies and how the right clothing can impact those perceptions.

The company has now closed on $2 million in seed funding from NEA and other investors to further develop its technology, which today involves an iOS application, web app and API platform that retailers can integrate with their own catalogs and digital storefronts.

To better understand a woman’s personal preferences around fashion, Lily uses a combination of algorithms and machine learning techniques to recommend clothing that fits, flatters and makes a woman feel good.

At the start, Lily asks the user a few basic questions about body type and style preferences, but it also asks women how they perceive their bodies.

For example, if Lily asks about bra size, it wouldn’t just ask for the size a woman wears, but also how she thinks about this part of her body.

“I’m well-endowed,” a woman might respond, even if she’s only a full B or smaller C – which is not necessarily the reality. This sort of response helps teach Lily how the woman thinks of her body and its various parts, so it can craft its recommendations accordingly. That same woman may then say she wants to minimize her chest, or that she likes to show off her cleavage.

But as she shops Lily’s recommendations in this area, the service learns what sorts of items the woman actually chooses and then adapts accordingly.

This focus on understanding women’s feelings about clothing is something that sets Lily apart.

“Women are looking for clothes to spotlight the parts of their body they feel most comfortable with and hide the ones that make them feel insecure,” explains Lily co-founder and CEO, Purva Gupta. “A customer makes a decision based on whether a specific cut will hide her belly or downplay a feature she doesn’t like. Yet stores do nothing to guide women toward these preferences or take the time to understand the reasons behind their selections,” she says.

Gupta came up with the idea for Lily after moving to New York from India, where she felt overwhelmed by the foreign shopping culture. She was surrounded by so much choice, but didn’t know how to find the clothing that would fit her well, or those items that would make her feel good when wearing them.

She wondered if her intimidation was something American women – not just immigrants like herself – also felt. For a year, Gupta interviewed others, asking them one question: what prompted them to buy the last item of clothing they purchased, either online or offline? She learned that those choices were often prompted by emotions.

Being able to create a service that could match up the right clothing based on those feelings was a huge challenge, however.

“I knew that this was a very hard problem, and this was a technology problem,” says Gupta. “There’s only one way to solve this at scale – to use technology, especially artificial intelligence, deep learning and machine learning. That’s going to help me do this at scale at any store.”

To train Lily’s algorithms, the company spent two-and-a-half years building out a collection of more than 50 million data points and analyzing over a million product recommendations for users. The end result is that an individual item of clothing may have over 1,000 attributes assigned to it, which are then matched against the thousands of attributes associated with the user in question.
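Lily hasn’t published how its matching works, but the idea of scoring a garment’s attribute vector against a user’s preference profile can be sketched roughly as follows. The attribute names, weights, and scoring function here are entirely hypothetical, chosen only to illustrate the shape of the problem:

```python
# Hypothetical sketch of attribute-based matching. Lily's real system,
# attribute vocabulary, and models are proprietary and not public.

def match_score(item_attrs, user_prefs):
    """Score an item by how strongly its attributes align with the
    user's preference weights (-1.0 = dislike .. +1.0 = strong like)."""
    score = 0.0
    for attr, strength in item_attrs.items():  # e.g. {"a_line": 0.9}
        score += strength * user_prefs.get(attr, 0.0)
    return score

# A user who wants to minimize her bust, dislikes bodycon cuts,
# and likes A-line silhouettes (all invented example attributes):
user = {"minimizes_bust": 0.8, "bodycon": -0.6, "a_line": 0.5}

dress_a = {"a_line": 0.9, "minimizes_bust": 0.7}
dress_b = {"bodycon": 1.0}

# Rank candidate items by their fit with this user's profile.
ranked = sorted([("dress_a", dress_a), ("dress_b", dress_b)],
                key=lambda kv: match_score(kv[1], user), reverse=True)
```

At Lily’s described scale, with over 1,000 attributes per item, such scoring would presumably be learned rather than hand-weighted, but the matching intuition is the same.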

“This level of detail is not available anywhere,” notes Gupta.

In Lily’s app, which works as something of a demo of the technology at hand, users can shop recommendations from 60 stores, ranging in price point from Forever 21 to Nordstrom. (Lily today makes affiliate revenue from sales.)

In addition, the company is now beginning to pilot its technology with a handful of retailers on their own sites – details it plans to announce in a few months’ time. This will allow shoppers to get unique, personalized recommendations online that could also be translated to the offline store, in the form of reserved items awaiting them when they arrive to shop in person.

Though it’s early days for Lily, its hypothesis is proving correct, says Gupta.

“We’ve seen conversion rates of 10x to 20x,” she claims. “That’s what’s very exciting and promising, and why these big retailers are talking to us.”

The pilot tests are paid, but the pricing details for Lily’s retailer-facing service are not yet set in stone, so the company declined to discuss them.

The startup was also co-founded by CTO Sowmiya Chocka Narayanan, previously of Box and Pocket Gems. It’s now a team of 16 full-time employees based in Palo Alto.

In addition to NEA, other backers include Global Founders Capital, Triplepoint Capital, Think + Ventures, Varsha Rao (Ex-COO of Airbnb, COO of Clover Health), Geoff Donaker (Ex-COO of Yelp), Jed Nachman (COO, Yelp), Unshackled Ventures and others.


TrueFace.AI is here to catch the facial recognition tricksters


Facial recognition technology is more prevalent than ever before. It’s being used to identify people in airports, put a stop to child sex trafficking, and shame jaywalkers.

But the technology isn’t perfect. One major flaw: It sometimes can’t tell the difference between a living person’s face and a photo of that person held up in front of a scanner.

TrueFace.AI facial recognition is trying to fix that flaw. Launched on Product Hunt in June, it’s meant to detect “picture attacks.”

The company originally created Chui in 2014 to work with customized smart homes. Then they realized clients were using it more for security purposes, and TrueFace.AI was born.

Shaun Moore, one of the creators of TrueFace.AI, gave us some more insight into the technology.

“We saw an opportunity to expand our reach further and support use cases from ATM identity verification to access control for data centers,” said Moore. “The only way we could reach scale across industries would be by stripping out the core tech and building a platform that allows anyone to use the technology we developed.”

“We knew we had to focus on spoof detection and how we could lower false positives.”

TrueFace.AI can detect when one or more faces are present in a frame and extract 68 raw landmark points for facial recognition. But its more distinctive feature is spoof detection, which can tell real faces from photos.
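TrueFace.AI’s models are proprietary, but the pipeline it describes (detect faces, extract 68 landmarks, then run a spoof classifier over each face) can be sketched structurally. Every function body below is a hypothetical stand-in, not TrueFace.AI’s code:

```python
# Structural sketch of a spoof-aware face pipeline. The detector and
# classifier here are placeholder stubs; TrueFace.AI's actual models
# are proprietary.

from dataclasses import dataclass

@dataclass
class FaceResult:
    landmarks: list  # 68 (x, y) points for one detected face
    is_live: bool    # False indicates a likely "picture attack"

def detect_landmarks(frame):
    # Stand-in for a 68-point facial landmark detector;
    # pretends exactly one face was found in the frame.
    return [[(0, 0)] * 68]

def spoof_score(frame, landmarks):
    # Stand-in for a classifier trained on photo-attack examples;
    # returns the estimated probability the face is a flat photo.
    return 0.1

def analyze(frame, spoof_threshold=0.5):
    # Run spoof detection on every face found in the frame.
    results = []
    for lm in detect_landmarks(frame):
        live = spoof_score(frame, lm) < spoof_threshold
        results.append(FaceResult(landmarks=lm, is_live=live))
    return results

faces = analyze(frame=None)
```

In a real deployment, the stubs would be replaced by trained detection and anti-spoofing models, and frames would come from a camera rather than a `None` placeholder.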

“While working on our hardware, we tested and used every major facial recognition provider. We believe that doing that (testing every solution available) and applying facial recognition to a very hard use case, like access control and the smart home, allowed us to make a better, more applicable solution,” said Moore. “All of these steps led us to understand how we could effectively deploy technology like ours in a commercial environment.”

The final product relies on deep learning: the team trained classifiers on thousands of attack examples collected over the years, and was pleased with the results.

A “freemium” package is available to encourage the development community that helped TrueFace.AI come up with a solution. Beyond that, the Startup Package is $99 per month while the Scale Package is $199 per month. An Enterprise Plan is available via a custom agreement with TrueFace.AI.

While Moore couldn’t divulge exactly which companies are using the technology, he did say some of them are in the banking, telecommunications, and health care industries.

It’s a service that could become increasingly valuable as companies turn to facial recognition technology.


Apple working on dedicated AI chip for iOS devices, report says


Apple is working on a new kind of chip, potentially for future iOS devices, that would be dedicated solely to AI processing, Bloomberg reports.

Bloomberg says the chip is called the Apple Neural Engine internally, and could be used for “offloading facial recognition in the photos application, some parts of speech recognition, and the iPhone’s predictive keyboard to the chip.”

By moving AI processing to a dedicated chip, battery life in devices could also see a boost since the main CPU and GPU wouldn’t be crunching as much data and gobbling as much power.

The report says Apple plans to integrate the chip into its devices, but it’s unclear when that’ll happen, and if any iOS devices launching this year will have it.

Apple’s work on an AI chip shouldn’t surprise anyone who’s paying attention to the competition. Virtually every tech company is working on improving AI processing on mobile devices.

Qualcomm’s latest Snapdragon 835 chip, which is already in devices like the Samsung Galaxy S8, has a special module dedicated to processing AI tasks.

Years ago, Apple started designing its own mobile processors to improve performance and reduce power consumption, and it’s really paid off.

Despite having fewer cores, the iPhone 7 still crushes the Galaxy S8 when it comes to sheer performance.

iPhones and iPads also come with an Apple-designed “M”-branded motion coprocessor to collect sensor data from the various included sensors (accelerometer, gyroscope, compass, etc.). It’s this M chip that helps with tracking health and fitness data.

Furthermore, in addition to the main Intel processors in the new MacBook Pros, there’s also a small Apple-made “T1” chip for powering the Touch Bar. Apple’s AirPods also have a custom W1 chip that helps with pairing them to iOS devices.

Clearly, Apple loves making custom chips for things. We’re all for it, especially if that means longer battery life.

If the future is AI everywhere (and it definitely looks like that’s where things are headed), it’s in Apple’s best interests to control the stack (like it always does) with its own AI chip.
