Meta's AI Power Play: Llama 3 & Smart Reel Search

Meta is shaking up the AI world with its open-source Llama 3 408B model and a new AI-powered Reel search. Listen to the full episode to learn more.

TL;DR

Meta is open-sourcing its massive 408B Llama 3 model to compete with OpenAI & Google, while its new AI Reel search is changing how we discover content. #VentureStep #AI #Meta

INTRODUCTION

The artificial intelligence arms race is accelerating, with major tech players vying for dominance. In a bold move, Meta is challenging the status quo by championing an open-source strategy for its most powerful AI models. This approach directly contrasts with the closed, proprietary systems of competitors like OpenAI and Google, potentially democratizing access to cutting-edge technology and fostering a new wave of innovation.

In this episode of Venture Step, host Dalton Anderson dives into Meta's two-pronged AI offensive. First, he breaks down the immense anticipation surrounding the release of Llama 3.1, a colossal 408-billion-parameter model poised to rival the industry's best. Dalton explains what makes this release so significant, from its performance benchmarks to the philosophical and practical advantages of making its architecture public.

Beyond the large-scale models, Dalton explores a new, highly practical AI tool: Meta's Reel search. He shares his firsthand experience using natural language to find specific video content for everything from fashion inspiration to market research for digital ads. This episode provides a comprehensive look at how Meta is leveraging AI at both the foundational and consumer-facing levels, offering powerful new tools for developers, creators, and everyday users alike.

KEY TAKEAWAYS

  • Meta's open-source strategy with Llama 3 aims to accelerate innovation by giving developers and researchers free access to a powerful, enterprise-level AI model.
  • The upcoming 408-billion-parameter Llama 3 model is positioned to directly compete with closed-source giants like OpenAI's GPT-4 and Google's Gemini.
  • Meta AI's new Reel search feature provides a powerful tool for specific content discovery, useful for everything from personal style inspiration to market research for product creatives.
  • Running the largest Llama 3 model locally will require significant computing power, likely an 800 GB download and a powerful multi-GPU setup.
  • As AI search becomes more prevalent, the opening frame of a video will become critically important for creators to capture attention from users no longer scrolling passively.

FULL CONVERSATION

Dalton: Welcome to the Venture Step Podcast, where we discuss entrepreneurship, industry trends, and the occasional book review. Today we're discussing Meta's exciting release. We'll be briefly discussing their 408-billion-parameter model that they're releasing later this week. Hopefully, it doesn't get delayed. There's been a lot of hype, so I'm pretty sure that Mark will release the model, and it's been confirmed that it will be open source. Initially, I don't think they were going to release the model weights, but there was some general pushback from the public, like, "Hey, if it's open source, let's make sure we release the model weights." Meta also released a new feature on their Meta AI website where you can search Reels.

Dalton: I think there are some interesting use cases, and I'll explain how this might be useful to you. I'll share what I was using it for as well. Then I'll give some general updates on the progress of the jobs and the nightmares I had working on my Nana's home over the last couple of weeks. I had some scares, but we got through them. That's a general sense of what we'll be discussing today.

Dalton: Before we dive in, of course, my name's Dalton Anderson, and I'm your host. My background is a bit of a mix of programming, data science, and insurance. Offline, you can find me running, building my side business, or lost in a good book. If you like your podcasts in video format, YouTube is the place to be. If audio is more your thing, you can find the podcast on Apple Podcasts, Spotify, or wherever else you get your podcasts.

Meta's Game-Changing 408 Billion Parameter Model

Dalton: Okay, so Llama 3's 408-billion-parameter model. Right now, Llama 3 has a 70-billion and an 8-billion-parameter model. Those models performed very well against their counterparts; in some cases, they were even lumped in with larger ones. Typically, models are classified as small, medium, or large. A large model would be something like this 408-billion-parameter model, which would be compared with other large models, though those don't necessarily have the same number of parameters. So in certain cases, the Llama 3 8-billion-parameter model is compared to other models with 15 or 20 billion parameters.

How Llama 3 Stacks Up Against The Competition

Dalton: Their 8-billion-parameter model is mainly made to work on wearables like the Meta glasses. I know the Meta glasses use their small model, but I'm unsure whether the Meta Quest uses the 8-billion or the 70-billion version. I don't know. But basically, the models perform fairly well against their peers in the same category, and most of the time the respective Meta models have fewer parameters than their counterparts, which is pretty cool.

Dalton: I was impressed with how the models performed with the things I asked. It seemed generally pretty legit for only 70 billion parameters, because when you go to meta.com, you're only using the 70-billion-parameter model since the 408 hasn't been released yet.

X and Meta have taken the approach of putting large sums of money into training these AI models and then open-sourcing them.

Dalton: And so if you go to meta.com, there's a big button in bold to download the models. All you have to do is fill out your name, who you are, and an email, and you can download the models. You just have to register yourself.

Hardware Requirements and Accessibility

Dalton: According to the leaks, there was some information about the 408-billion-parameter model, and it had a download payload of, I think, 800 gigs or something like that. So it's almost a terabyte of stuff you have to download. And that's not even to run the model; that's just to have it on your machine. You'd probably need to either combine the resources of multiple computers or have a really powerful multi-GPU rig for compute. I haven't looked into it that much, but I do know that's too much to run on one computer. You could maybe combine two really good computers and run it, but not at its full capabilities, obviously.
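A rough back-of-envelope check (my own arithmetic, not official Meta figures) squares with that 800-gig estimate: at 16-bit precision, each parameter takes 2 bytes, so a 408-billion-parameter model needs roughly 816 GB just to hold the weights, before any activations or KV-cache overhead.

```python
# Back-of-envelope memory estimate for hosting a large LLM.
# The parameter count and GPU size are assumptions for illustration,
# not official Meta specifications.

def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the weights, in gigabytes."""
    return params_billions * 1e9 * bytes_per_param / 1e9

PARAMS_B = 408    # the leaked parameter count discussed in the episode
GPU_GB = 80       # an 80 GB accelerator as a yardstick

for precision, nbytes in [("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = weights_gb(PARAMS_B, nbytes)
    gpus = -(-gb // GPU_GB)  # ceiling division: minimum cards for weights alone
    print(f"{precision:>10}: ~{gb:,.0f} GB of weights, at least {gpus:.0f} x {GPU_GB} GB GPUs")
```

Even aggressively quantized to 4 bits, the weights alone are around 200 GB, which is why a single consumer machine is out of the question and a multi-GPU rig is table stakes.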

Dalton: That being said, you could download the 70-billion or the 8-billion-parameter model and make your own little app for free. An 8-billion-parameter model can be stored on watches, glasses, and things like that.

What Are the Use Cases for Meta's Wearable AI?

Dalton: And that's what Meta uses for their Meta Ray-Ban glasses. That's the model they use. People have said they find it useful, and eventually they're going to get it to the point of being able to see. You give it permission, like, "Hey Meta, can you look at what I'm looking at? Can you tell me what year this car was produced?" Maybe you're looking at an antique car. Or maybe you ask it how to make a certain type of dish, or what ingredients the dish typically has. Or maybe you don't even know the dish, and you ask, "What is this dish?" and it finds out for you.

Dalton: At some point, you'll be able to talk to your Meta glasses and say, "Hey, when does this place open?" And then, without you going to your phone, Meta will search. Eventually, they'll want the glasses to search the internet themselves, but I don't know where that stands. Maybe they're connected to your phone via Bluetooth and somehow send commands to your phone to search, get the information back, and then Meta talks back to you. But that might take too long, and you might as well just look it up on your phone. Long story short, eventually Meta glasses will let you search for stuff from your face without taking out your phone, which is nice if you're down for that. I think it's pretty cool.

The Power of Open-Sourcing Large AI Models

Dalton: The 408-billion-parameter model is more of an enterprise-level model. It would compete with GPT-4o, Gemini Advanced, and Gemini 1.5 Pro. That's where this model is positioned when it gets released. It's supposed to be an open-source model that competes with OpenAI, Google, and Anthropic. Will it do that? I don't know. From the leaks, it seems like it's going to be very good, but I'm not completely sure. There's a lot of hype in AI, and sometimes I've gotten caught up in it, especially with Google's AI announcement, where they staged the demo. There's also Devin, the AI code bot that's supposed to be super good at coding; they staged that demo too.

I don't really trust things until I look at them myself nowadays, unless I can get my hands on it or there have been some independent reviews of these releases.

Dalton: But we'll see soon, because hopefully it's going to be released in the coming days. The potential is that you allow researchers or students at a university to play around with these models for free. It gives people an idea playground, I would say.

You have the ability to play around with the model... for free and you get to be able to see the weights, you get to see how it was constructed, you get to see everything, which is not something you get to do with closed source models.

Dalton: And I think the general consensus is, hey, open-source things are better. There are a lot more good people in the world than bad people, and if you open-source things, there are a lot more eyes on what's going on. Security defects or leaks can be fixed a lot faster because many more people are seeing and using the code. When things are open source, maintaining the code base gets a little easier because the coding public has an interest in it. Overall, open source allows things to move quicker.

The Licensing Model: Free for Some, Not for All

Dalton: I think the license covers everything except large-scale commercial use. A big company can't just build on it for free, but a smaller one can. I don't think the terms have been finalized; I think they'll become more relevant when the 408-billion-parameter model comes out, because the first two models are great but not amazing. Once we get to that level, I think they'll provide some clear guidelines. I know Mark has talked about not wanting a massive company to use this model for free. He said if they're big, he's going to make them pay.

...if some small shop is trying to get started and want to use our model to build their product, then we could let them do that for, you know, if their revenue is under like 10 million or something like that.

Dalton: So I think it depends on the size of the company whether it's free or not. I think that's the structure of the license. But those things are subject to change all the time, so you have to be careful.

Introducing Reel Search: A New Way to Discover Content

Dalton: Okay, so Reel search. With Meta AI, if you go to the Meta AI website, you can now search for Reels. I was using it at first for fashion. I've got to get my fashion up. I may have a lot of positive things going for myself, but one thing I don't really have going on is fashion. My fashion sense and my ability to properly structure outfits with a defined stylistic view are poor. So I'm trying to become more fashionable, which is a difficult thing when you're starting from zero.

A Personal Use Case: Improving Fashion Sense with AI

Dalton: You don't really know who you are, what you like. So you had to go through this whole thing where you have to look up different types of styles, figure out the defining color palette of that style, and figure out if you can combine different styles to make your personal style. You don't want to just copy-paste other people's stuff; you want to make your own identity. You could use Pinterest to do that, or you could use this Meta AI where you could search for the style that you want, which is pretty cool. I think I still like Pinterest more because you can save it, but you can get general Reels from this searchability. Or you could ask it, "what are some home decorating ideas?" and bam, it pulls up popular Reels.

Dalton: From there, you can find the suggested posts related to that post. It might point you to an area where you want to start when you don't necessarily know where to find that first Reel or video.

Using Reel Search for Market Research and Creatives

Dalton: You could search via Meta AI, like, "I want to see stuff relating to home decor," and it shows you Reels. I'm sharing my screen for those watching the video. I asked Meta AI, "Can you show me Reels of creatives for products in the home space?" A creative is just a video ad. It sent me stuff related to different ideas like decor inspiration, home organization, and smart home tech. Then I said, "Please show me Reels," and it showed me Reels of people cooking and doing organization projects. All of these Reels might not be labeled as ads, but they are advertising a product with sneaky advertising creatives.

Dalton: When the models become more advanced, maybe even with this 408-billion-parameter model, it will know exactly what I'm looking for when I ask for these things. It will scour the comments and the posts on Instagram and recognize when something that isn't labeled as an ad really is one through organic marketing. And it will populate that stuff for me, because that would save some time.

...you can use this feature to save you time on discovering what kind of creatives you want.

Dalton: I don't think it would save time on organic product discovery if you want to just scroll through, but you could ask Meta, "Hey, can you provide me creatives that are in this space that are labeled as ads or seem like they're ads?" And then Meta would just be like, "Yeah, sure thing." Then you can just watch all these ads and say, "Okay, I like what these people did. The way that they presented this is very nice." It makes it a little bit easier to figure out what creatives you want and get new ideas.

Dalton: I think it's pretty cool and going to be useful not only for product discovery but for other things as well, like my personal example of improving my fashion. You can also use it for birthday ideas, places to travel, and those kinds of things. You could see actual videos of where people are going, what they're doing, the best restaurants in Tampa. If I search "best places to eat in Florida," let's see what it says. I will say that querying Reels does take a while, maybe 40 seconds.

The Future Implications for Content Creators

It gives an ability for people to find content that they're really searching for instead of scouring for it.

Dalton: And so maybe they don't have to watch as much stuff anymore; they can just find exactly what they want. I don't need to scroll through 12 Reels or 12 TikToks to get to what I need. All I need to do is find the one I like. It shows about 10 Reels per query. I think the implication for creators is that you need a really good selling frame.

...in the future, you're going to need to have a really good selling frame, like your starting frame for your TikTok or Reel needs to be good.

Dalton: If people start using this feature, I'm just not going to click on a lot of Reels. I'm not even going to watch one if the first frame doesn't look interesting.

A Quick Update: The Trials of Home Remodeling

Dalton: All right, so house update. I am almost finished helping remodel my Nana's home. I had help from someone working on the house, and he did the electrical, which I don't know how to do. I did the closets, which sadly aren't fully done yet, but they're getting there; they're so close. The guy working on the house finished the high hats, which makes me glow in this podcast video because there's so much light now.

Dalton: He also helped fix my attic. My attic didn't have as much insulation as it needed, and the gable vents were blocked. We did have gable vents, but for some reason someone had closed them up. Basically, the gable is where the two slants of a roof meet to make a little triangle. There should normally be a vent in that area to let hot air escape from your attic: hot air rises and goes out through the vent, which passively draws cooler air in and cools down the attic. For some reason, someone had closed our vents and covered them up. So he helped cut out the framing, cut a hole through the siding, picture-frame it, put a vent in, and waterproof it. Which is great for me because the house is a lot cooler than it normally is.

Dalton: But we did have a plumbing scare. We were replacing our shower trim, which is basically the valve inside the wall, the shower arm, and the shower head. Lewis took out the drain, and the original drain broke on removal, which is pretty common for older drains; mine was probably 50 years old. It broke, and my new drain didn't fit. Other drains didn't fit either. I was in a position where, if I couldn't find a drain that fit, I couldn't use my shower, and I would have to pay him to either chip out underneath the tub to disconnect the drain pipe or rip everything out and redo it. Both of those were out of scope.

Dalton: Luckily, I found a drain that fit my dimensions, which was awesome because it saved a lot of money I didn't want to spend. The problem was the thread width; the new drains' threads were too big, and I couldn't make them smaller. But I eventually found one that worked, and everything has a happy ending.

Dalton: So I'm excited. Next week, we're going to be discussing the 408-billion-parameter model. Hopefully, we'll have a live demo of the model during the episode, which would be great. And of course, before we end, have a great day, good morning, or good afternoon, wherever you are in this world, and I hope you'll be listening next week. Have a great day. Thank you for listening.

RESOURCES MENTIONED

  • Meta
  • X.com / Twitter
  • YouTube
  • Apple Podcasts
  • Spotify
  • Meta Ray-Ban glasses
  • Meta Quest 3
  • OpenAI
  • Google
  • Anthropic
  • Devin AI
  • Pinterest
  • Instagram
  • TikTok

INDEX OF CONCEPTS

Dalton Anderson, Meta, Llama 3, 408 billion parameter model, 70 billion parameter model, 8 billion parameter model, Mark Zuckerberg, Meta AI, Reel Search, X.com, Twitter, Meta Ray-Ban glasses, Meta Quest 3, OpenAI, GPT-4, Google, Gemini Advanced, Gemini Pro 1.5, Anthropic, Devin AI, Pinterest, Instagram, TikTok, open source AI, model weights, GPU, AI wearables, creatives, home decor, fashion, gable vents, shower trim