LinkedIn has surpassed 1 billion members and has launched an AI chatbot billed as a “job seeker coach,” along with other generative AI tools for Premium members.
The company had been quietly rolling out AI features centered on automated recruiter messages and job descriptions. Two weeks ago, LinkedIn announced nearly 700 job cuts, most of them in its engineering organization.
The professional networking platform debuted the AI-powered chatbot on Wednesday, November 1. For months, the Microsoft-owned company has been focusing its attention on tools like automated recruiter messages, job descriptions, and AI-powered profile writing suggestions.
The new AI chatbot, which aims in part to help users gauge whether a job application is worth their time, is powered by OpenAI’s GPT-4 and began rolling out to some Premium users on Wednesday. Microsoft has invested billions of dollars in OpenAI.
LinkedIn’s engineering team had to invest heavily on the platform side to reduce latency, according to Erran Berger, LinkedIn’s vice president of product engineering.
According to Berger, the team had to build a great deal of infrastructure to work around that latency and make the chatbot feel intelligent. He added that a conversational experience can feel almost like search: users expect it to be instant, so there were real platform capabilities the team had to develop to make that possible.
LinkedIn is doing all it can to reaccelerate revenue growth after eight straight quarters of slowing expansion.
Users of the new chatbot can launch it from a job posting by selecting one of a few questions, such as “Am I a good fit for this job?” and “How can I best position myself for this job?”
The former would prompt the tool to analyze a user’s LinkedIn profile and experience, with answers like, “Your profile shows that you have extensive experience in marketing and event planning, which is relevant for this role.”
The chatbot will also flag potential gaps in a user’s experience that could hurt them in the application process.
Users can also follow up by asking who works at the company, which will prompt the chatbot to surface a few employee profiles, possibly second- or third-degree connections, whom the user can then message about the opportunity. The message itself can also be drafted using generative AI.
Previously, many uses of AI in hiring or job applications have faced criticism for bias against marginalized communities. One example was Amazon’s use of a recruiting engine that reportedly downvoted resumes that included the word “women” or mentioned women’s colleges.
A separate study highlighted by the Harvard Business Review found bias against Black candidates in an analysis of respondents’ job board recommendations.
According to Berger, LinkedIn has invested heavily to ensure the tool stays within the guardrails of the company’s responsible AI standards.