On April 25, code-hosting platform giant GitLab released an AI-powered security feature that uses a large language model to explain potential vulnerabilities found during development. The company reportedly plans to keep expanding the feature in line with its strategic goal of using artificial intelligence to automate security work.
The feature identifies the best way to fix a vulnerability in the code base by combining basic information about the potential vulnerability with insights specific to the affected code. Backed by GitLab's full-stack DevSecOps platform, this combination of AI-assisted coding and security testing balances the safety of AI-generated content against deployment efficiency and raises developer productivity. With user privacy and compliance in mind, GitLab COO David DeSanto said: "If it touches customer code, the feature will only send it to specific models within the GitLab cloud architecture and will not leak it to third-party AI. We also will not use any user's private data to train the model."
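As a rough illustration of that combination, the sketch below (hypothetical, not GitLab's actual implementation; the `Finding` type and `build_explanation_prompt` helper are invented for this example) merges a scanner's generic vulnerability report with the specific code it flagged into a single prompt that a large language model could turn into an explanation and a suggested fix:

```python
# Hypothetical sketch of pairing a scanner finding with code context
# before asking an LLM to explain the risk and propose a fix.
from dataclasses import dataclass

@dataclass
class Finding:
    cwe_id: str        # e.g. "CWE-89" (SQL injection)
    description: str   # scanner's generic description
    file_path: str
    line: int

def build_explanation_prompt(finding: Finding, code_snippet: str) -> str:
    """Combine generic vulnerability info with code-specific context."""
    return (
        f"A scanner flagged {finding.cwe_id} ({finding.description}) "
        f"at {finding.file_path}:{finding.line}.\n"
        "Explain the risk in this code and propose a minimal fix:\n\n"
        f"{code_snippet}"
    )

finding = Finding("CWE-89", "SQL injection", "app/db.py", 42)
snippet = 'cursor.execute("SELECT * FROM users WHERE id = " + user_id)'
prompt = build_explanation_prompt(finding, snippet)
print(prompt)
```

In a privacy-preserving setup like the one DeSanto describes, a prompt of this kind would be sent only to models inside the vendor's own infrastructure rather than to a third-party API.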
According to the report, GitLab aims for its AI to make the entire development lifecycle ten times more efficient. DeSanto also pointed out that boosting developer productivity alone, even a hundredfold, will not help if downstream inefficiencies in production drag overall throughput back down. GitLab therefore wants to raise the overall efficiency of the security, operations, and compliance teams as well, and the newly released security feature is an integral part of that effort.
Currently, the code-explanation feature helps security teams meet their daily testing goals while serving developers. Longer term, GitLab hopes to build features that help teams automate unit testing and security reviews, and to integrate them into the GitLab platform.
According to the latest DevSecOps report, 65 percent of developers use or plan to use AI and machine learning in their testing efforts, and 36 percent of teams use AI/ML tools to help review code. In a mid-March statement, GitLab wrote: "Given the resource constraints we currently face, automation and AI have become the new strategic resources. Our DevSecOps platform can help teams fill critical gaps while automating policies and applying compliance frameworks, and GitLab will support automated security testing and provide AI-based recommendations to free up resources."