Install Meta-Llama-3.1-8B-Instruct Locally on Your MacBook

By Anurag Arya | April 9, 2025 | 2 Mins Read


Create a Python file named install-llama-3.1-8b.py with the following code:

    from huggingface_hub import login
    from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
    import torch

    # Log in to Hugging Face
    access_token_read = ""
    login(token=access_token_read)

    # Model ID
    model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"

    # Load the model (simple version, no quantization)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",
        torch_dtype=torch.float16,  # use bfloat16 or float16 if supported
    )

    # Load the tokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    # Create a text-generation pipeline
    text_gen = pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        pad_token_id=tokenizer.eos_token_id,
    )

    # Test the pipeline
    response = text_gen("What is the capital of France?", max_new_tokens=100)
    print(response[0]["generated_text"])
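The script above passes a raw string to the pipeline, which works, but instruct-tuned Llama models are trained on chat-formatted input, so prompts are normally wrapped in a role/content message structure first. A minimal sketch (the helper name and system prompt are illustrative, not part of any library):

```python
# Sketch: build the messages list in the shape that
# tokenizer.apply_chat_template() expects for instruct-tuned models.
def build_messages(user_prompt, system_prompt="You are a helpful assistant."):
    """Return a chat-style messages list for an instruct-tuned model."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages("What is the capital of France?")
```

The list can then be rendered with tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) and passed to text_gen; recent versions of the transformers text-generation pipeline also accept the messages list directly.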

Log in to your Hugging Face account and generate an access token with user and repository read permissions, then paste it into access_token_read in the script.
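Hard-coding the token in the script is convenient but easy to leak if the file is shared. As an alternative, a small sketch that reads it from an environment variable instead (HF_TOKEN is the variable huggingface_hub itself recognizes):

```python
import os

# Read the Hugging Face token from the environment instead of hard-coding it.
# An empty default keeps the script runnable for public models.
access_token_read = os.environ.get("HF_TOKEN", "")
```

Set it once in your shell with export HF_TOKEN=... and the script no longer needs to contain the secret.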

    Run the script:

    python install-llama-3.1-8b.py

Upon successful execution, the script will:

    • Download the model from the Hugging Face repository into the local cache (/Users/<username>/.cache). On subsequent runs the model will be loaded from the local cache.
    • Send a prompt to the model and display the response
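The cache location mentioned above can also be computed in code. A small stdlib-only sketch (HF_HOME is the override variable Hugging Face documents; everything else here is illustrative):

```python
import os

# Default Hugging Face cache directory; the HF_HOME environment variable
# overrides it. On macOS this resolves to /Users/<username>/.cache/huggingface.
def hf_cache_dir():
    default = os.path.join(os.path.expanduser("~"), ".cache", "huggingface")
    return os.environ.get("HF_HOME", default)

print(hf_cache_dir())
```

Deleting this directory forces a fresh download on the next run.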

In this guide, you've learned how to set up and run the Meta-Llama-3.1-8B-Instruct model locally on a macOS machine using Hugging Face Transformers and PyTorch. Running LLMs locally gives you more control, privacy, and customization power.

If you've followed the steps successfully, you should now be able to:

    • Load and run Llama 3.1 using a simple Python script
    • Load large models efficiently in half precision (float16), with quantization as a further option
    • Generate text responses using instruct-tuned prompts
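To see why half precision (and quantization) matters on a laptop, some back-of-envelope arithmetic for an 8B-parameter model, counting model weights only (activations and the KV cache add more on top):

```python
# Rough memory footprint of the weights alone:
# bytes = parameters x bytes per parameter.
def weight_memory_gib(n_params_billion, bytes_per_param):
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

fp32 = weight_memory_gib(8, 4)    # float32
fp16 = weight_memory_gib(8, 2)    # float16, as in the script above
int4 = weight_memory_gib(8, 0.5)  # 4-bit quantized
print(round(fp32, 1), round(fp16, 1), round(int4, 1))
```

Halving the precision halves the footprint, which is what makes an 8B model feasible on a MacBook with 16-24 GB of unified memory.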

Next Steps

    • Build a chatbot or command-line assistant using this model
    • Explore prompt engineering to optimize results
    • Experiment with multi-turn conversations
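For the multi-turn idea, the usual pattern is to keep the full message history and re-send it to the model on every turn. A minimal sketch (the class and method names are assumptions for illustration, not part of any library):

```python
# Multi-turn state: append each user turn and each model reply to one
# history list, then pass the whole list to the model every turn.
class Conversation:
    def __init__(self, system_prompt="You are a helpful assistant."):
        self.messages = [{"role": "system", "content": system_prompt}]

    def add_user(self, text):
        self.messages.append({"role": "user", "content": text})

    def add_assistant(self, text):
        self.messages.append({"role": "assistant", "content": text})

chat = Conversation()
chat.add_user("What is the capital of France?")
chat.add_assistant("Paris.")
chat.add_user("And its population?")  # the model sees the earlier turns too
```

Because the model itself is stateless, follow-up questions like "And its population?" only work when the earlier turns are included in the prompt.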


