
    How Eigenfaces Got Me Hooked on Machine Learning | by TensorNomad | Apr, 2025

By FinanceStarGate | April 9, 2025 | 5 Mins Read


    Technique 2

This can be a bit better now. The next modification I'll make is this: we let the user add an image (manually or via code), and the program tells us whether that image is in the dataset or not, then displays the recognized image with overlaid visuals.

Keep your load_images(), compute_pca(), project_face() as they are, and then we do this:
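(The project_face() function itself was defined in the earlier part and isn't repeated here. Purely as a reminder, a minimal version consistent with how it's called below would look like this; the original's exact body may differ.)

```python
import numpy as np

def project_face(image, mean_face, eigenfaces):
    # Center the flattened image, then project it onto the eigenface basis.
    # `image` and `mean_face` are 1-D pixel vectors; `eigenfaces` has one
    # unit-length eigenface per column. Returns the coefficient vector.
    return np.dot(image - mean_face, eigenfaces)
```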

So let's compute the PCA:

import numpy as np

def compute_pca(images, num_components=50):
    # Compute the average face (used to center the data)
    mean_face = np.mean(images, axis=0)

    # Subtract the average from every image (center the data)
    centered_images = images - mean_face

    # Compute the small covariance matrix (L = X Xᵗ) → faster if few samples
    L = np.dot(centered_images, centered_images.T)

    # Eigen-decomposition of the smaller matrix
    eigenvalues, eigenvectors_small = np.linalg.eigh(L)

    # Sort eigenvectors by decreasing eigenvalue
    idx = np.argsort(-eigenvalues)
    eigenvectors_small = eigenvectors_small[:, idx]

    # Convert small-space eigenvectors to full image-space eigenvectors
    eigenvectors = np.dot(centered_images.T, eigenvectors_small)

    # Take only the top `num_components` eigenfaces
    eigenfaces = eigenvectors[:, :num_components]

    # Normalize each eigenface (make every vector unit length)
    for i in range(eigenfaces.shape[1]):
        eigenfaces[:, i] /= np.linalg.norm(eigenfaces[:, i])

    # Return the average face and the actual eigenfaces
    return mean_face, eigenfaces

How is this different? Well, this is a trick for calculating the PCA.
Normally, PCA does some heavy math on an enormous matrix built from every single pixel in the image, and that gets slow really fast when images are big. But this method flips the script. Instead of working with all the pixels directly, it works with the number of images (which is usually much smaller), does the math there, and then cleverly translates the result back to the full image size. You get the same final result, the "eigenfaces", but it runs much faster and uses less memory. It's like solving a big problem through a shortcut that still gives the right answer.
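If you want to convince yourself the shortcut really gives the same answer, here is a quick numerical sanity check (not part of the original post): the top eigenvector of the small N×N matrix X Xᵗ, mapped back through Xᵗ, points in the same direction as the top eigenvector of the full pixel-space matrix Xᵗ X.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.random((10, 100))   # 10 "images" of 100 pixels each
X -= X.mean(axis=0)         # center the data, as compute_pca does

# The trick: eigen-decompose the small 10x10 matrix L = X X^T ...
L = X @ X.T
vals_small, vecs_small = np.linalg.eigh(L)
v = vecs_small[:, np.argmax(vals_small)]
top = X.T @ v                       # map back to pixel space
top /= np.linalg.norm(top)          # normalize, as compute_pca does

# ... versus the direct route: the full 100x100 matrix X^T X.
C = X.T @ X
vals_full, vecs_full = np.linalg.eigh(C)
ref = vecs_full[:, np.argmax(vals_full)]

# Same direction, up to sign: |dot product| is (numerically) 1.
print(abs(np.dot(top, ref)))
```

The identity behind it: if (X Xᵗ)v = λv, then (Xᵗ X)(Xᵗv) = λ(Xᵗv), so Xᵗv is an eigenvector of the big matrix with the same eigenvalue.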

Cool, right?

Okay, now we replace the recognize_face() function with this new function:

import cv2
import numpy as np

def predict_face(image, mean_face, eigenfaces, images, labels, label_dict, threshold=3000):
    # Resize and flatten the test image, then project it into eigenface space
    image = cv2.resize(image, (100, 100)).flatten()
    projected_test = project_face(image, mean_face, eigenfaces)
    projected_train = np.dot(images - mean_face, eigenfaces)

    # Distance from the test face to every known face
    distances = np.linalg.norm(projected_train - projected_test, axis=1)
    best_match = np.argmin(distances)
    best_distance = distances[best_match]

    print(f"Best Match Distance: {best_distance:.2f}")

    if best_distance < threshold:
        return label_dict[labels[best_match]], best_distance
    else:
        return "Unknown", best_distance

Before visualizing, ask yourself what is actually happening in the predict_face function.
The predict_face() function is where the actual face recognition magic happens. Think of it like this: your program already knows a bunch of faces (from the dataset), and now you're showing it a new one and asking, "Hey, do you recognize this person?"

To answer that, the function doesn't just look at the raw image. Instead, it projects both the known faces and the new one into a compressed version of the face world using PCA; we call this the eigenface space. It's kind of like shrinking the face down to just its most important features.

Now here's the key part: it calculates how far apart the new face is from every known face in that compressed space. The closer the match (i.e., the smaller the distance), the more likely it is that the face is recognized.
Now, what does this threshold thing mean?
It's the system's way of saying, "I'll only claim I recognize someone if I'm really sure."

If the best match is within the threshold (i.e., close enough), it says: "Yes, I recognize this face!" If it's too far away, it plays it safe and says: "Sorry, I don't know this person." This prevents the program from guessing someone's identity when the face doesn't really match; it's like setting a minimum level of confidence.
Imagine you're trying to recognize someone at a distance. If they look a lot like your friend, you wave and say hi. But if you're not quite sure, maybe it's a little blurry or they just don't seem right, you hesitate and think, "Hmm, maybe not."

That hesitation is your brain using a "threshold." In code, we give the computer the same option.
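To make the accept/reject decision concrete, here is a tiny standalone sketch of that logic with toy eigenface coefficients (made-up numbers, not from the actual model):

```python
import numpy as np

def decide(projected_test, projected_train, names, threshold=3000):
    # Distance from the test face to every known face in eigenface space
    distances = np.linalg.norm(projected_train - projected_test, axis=1)
    best = int(np.argmin(distances))
    if distances[best] < threshold:
        return names[best], distances[best]   # close enough: confident match
    return "Unknown", distances[best]         # too far away: play it safe

# Toy eigenface coefficients for three known people (2 components each)
train = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
names = ["alice", "bob", "carol"]

print(decide(np.array([5.0, 5.0]), train, names, threshold=50))     # near alice
print(decide(np.array([900.0, 900.0]), train, names, threshold=50)) # near nobody
```

The first query sits a short distance from "alice" and is accepted; the second is far from everyone, so the threshold makes the system answer "Unknown" instead of guessing.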

And since we've done all this, we should visualize the result too, right? So:

import matplotlib.pyplot as plt

def show_result(image, label, distance):
    plt.imshow(image, cmap='gray')
    plt.title(f"Prediction: {label} (Distance: {distance:.2f})")
    plt.axis('off')
    plt.show()

And then, once again, we load the data and train:

# Load dataset and train the model
images, labels, label_dict = load_images(dataset_path)
mean_face, eigenfaces = compute_pca(images)

Then we add a fuller show_result() function, upload an image of our own, and check whether this is what we want:

def show_result(image, label, distance, threshold=3000):
    plt.figure(figsize=(5, 5))
    plt.imshow(image, cmap='gray')

    if label == "Unknown" or distance > threshold:
        title_color = 'red'
        message = f"Not Recognized\nDistance: {distance:.2f}"
    else:
        title_color = 'green'
        message = f"Recognized: {label}\nDistance: {distance:.2f}"

    plt.title(message, color=title_color, fontsize=12)
    plt.axis('off')
    plt.show()

from google.colab import files
uploaded = files.upload()

for fname in uploaded.keys():
    test_img = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    result, dist = predict_face(test_img, mean_face, eigenfaces, images, labels, label_dict)
    show_result(test_img, result, dist)  # now displays a clear label & message

Now let's check out the outputs.
Test with a known face:

Voila! There you have it ... this guy is allowed into the mess hall.

But an unknown person?

Sorry Mr. Allen, looks like you aren't on the list, so no entry into the mess for you 🙂🙂

Paul Allen can still have some food though ..... OwO (I love Paul Allen)

And oh yes, before I leave I'll give the whole code here, and that's it for today. I will be posting more exciting things; compared to those, this is just a pebble in the sea.

The journey's just begun. I'll see you in the next dimension. TensorNomad out.



