    Selection of the Loss Functions for Logistic Regression | by Rebecca Li | Mar, 2025

By FinanceStarGate · March 8, 2025 · 2 Mins Read


To better understand the difference, let's visualize how MSE and Cross-Entropy Loss behave for classification.
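The original visualization is not reproduced here, but a minimal numeric sketch (assuming a single example with true label y = 1; the values are illustrative) makes the contrast concrete:

```python
import numpy as np

# Predicted probabilities for one example whose true label is y = 1
p = np.array([0.99, 0.9, 0.5, 0.1, 0.01])

mse = (1 - p) ** 2        # squared error for this example
ce = -np.log(p)           # binary cross-entropy (only the y = 1 term survives)

for pi, m, c in zip(p, mse, ce):
    print(f"p = {pi:.2f}   MSE = {m:.4f}   CE = {c:.4f}")
```

As p approaches 0 (a confidently wrong prediction), MSE saturates at 1 while cross-entropy grows without bound, so cross-entropy produces much stronger gradients exactly where the model is most wrong.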

Now that we have decided to use cross-entropy as the loss function, we need to take its gradient so it can guide the updates to the weights and bias.
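When the sigmoid output is paired with cross-entropy, the gradient takes a particularly clean closed form. With \( \hat{y}_i = \sigma(w^\top x_i) \):

```latex
\mathcal{L}(w) = -\frac{1}{n} \sum_{i=1}^{n}
    \Big[ y_i \log \hat{y}_i + (1 - y_i) \log\big(1 - \hat{y}_i\big) \Big]
% Using \sigma'(z) = \sigma(z)\,(1 - \sigma(z)), the chain rule collapses to
\frac{\partial \mathcal{L}}{\partial w} = \frac{1}{n}\, X^\top \big(\hat{y} - y\big)
```

The sigmoid's derivative cancels the denominators produced by the logarithms, which is why the final expression contains only the prediction error \( \hat{y} - y \).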

    """
    ------ Psudo Code------
    # Practice logistic regression mannequin
    mannequin = LogisticRegression()
    mannequin.match(X_train, y_train)
    # Predict possibilities for take a look at set
    y_probs = mannequin.predict_proba(X_test)[:, 1] # Likelihood for sophistication 1 (related)
    # Rank objects primarily based on predicted possibilities
    rating = np.argsort(-y_probs) # Destructive signal for descending order
    ------ Fundamental operate ------
    """import numpy as np
    """
    Fundamental operate
    """
    def pred (X,w):
    # X: Enter options, w: Weights
    # y hat = sig (1/ 1+e^-z), z = wx +b
    z = np.matmul(X,w)
    y_hat = 1/(1+np.exp(-z))
    return y_hat
    def loss(X,Y,w):
    # Compute the binary cross-entropy loss
    # Loss just isn't instantly used within the practice,g however its gradient is used to replace the weights
    y_pred = pred(X, w)
    sum = - Y * np.log(y_pred) + (1-Y) * np.log(1-y_pred)
    return - np.imply (sum)
    def gradient (X,Y,w):
    # Spinoff of Loss to w and b
    # The gradient of the loss operate tells us the path and magnitude during which the mannequin’s parameters needs to be adjusted to attenuate the loss.
    y_pred = pred(X, w)
    g = - np.matmul( X.T, (y_pred- Y) ) / X.form[0]
    return g
    """
    Part of coaching and testing
    """
    def practice(X,Y, iter= 10000, learning_rate = 0.002):
    w = np.zeros((X.form[1],1)) # w0 intialization at Zero
    for i in vary(iter):
    w = w - learning_rate * gradient(X,Y,w)
    y_pred = pred(X,w)
    if i % 1000 == 0:
    print (f"iteration {i}, loss = { loss(X,Y,w)}" )
    return wdef take a look at(X,Y, w):
    y_pred = pred(X,w)
    y_pred_labels = (y_pred > 0.5).astype(int)
    accuracy = np.imply(y_pred_labels == Y)
    return accuracy
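As a sanity check, the training loop above can be exercised end to end on synthetic data. The dataset, seed, and hyperparameters below are illustrative choices, not from the article, and `pred`/`gradient` are restated so the snippet runs standalone:

```python
import numpy as np

def pred(X, w):
    # Sigmoid of the linear score Xw
    return 1 / (1 + np.exp(-np.matmul(X, w)))

def gradient(X, Y, w):
    # Gradient of binary cross-entropy: X^T (y_hat - y) / n
    return np.matmul(X.T, (pred(X, w) - Y)) / X.shape[0]

# Synthetic data drawn from a known logistic model
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
true_w = np.array([[2.0], [-3.0]])
Y = (rng.uniform(size=(n, 1)) < pred(X, true_w)).astype(float)  # Bernoulli labels

# Plain gradient descent
w = np.zeros((2, 1))
for _ in range(5000):
    w = w - 0.1 * gradient(X, Y, w)

accuracy = np.mean((pred(X, w) > 0.5).astype(int) == Y)
print(f"train accuracy: {accuracy:.2f}")
```

The learned weights should land near `true_w` in sign and rough magnitude, and training accuracy should be well above chance, confirming the gradient's sign is correct (a flipped sign turns descent into ascent and the loss diverges).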


