{"id":181265,"date":"2025-07-17T15:00:00","date_gmt":"2025-07-17T15:00:00","guid":{"rendered":"https:\/\/entertainment.runfyers.com\/index.php\/2025\/07\/17\/confident-security-the-signal-for-ai-comes-out-of-stealth-with-4-2m-techcrunch\/"},"modified":"2025-07-17T15:00:00","modified_gmt":"2025-07-17T15:00:00","slug":"confident-security-the-signal-for-ai-comes-out-of-stealth-with-4-2m-techcrunch","status":"publish","type":"post","link":"https:\/\/entertainment.runfyers.com\/index.php\/2025\/07\/17\/confident-security-the-signal-for-ai-comes-out-of-stealth-with-4-2m-techcrunch\/","title":{"rendered":"Confident Security, \u2018the Signal for AI,\u2019 comes out of stealth with $4.2M | TechCrunch"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p id=\"speakable-summary\" class=\"wp-block-paragraph\">As consumers, businesses, and governments flock to the promise of cheap, fast, and seemingly magical AI tools, one question keeps getting in the way: How do I keep my data private?<\/p>\n<p class=\"wp-block-paragraph\">Tech giants like OpenAI, Anthropic, xAI, Google, and others are quietly scooping up and retaining user data to improve their models or monitor for safety and security, even in some enterprise contexts where companies assume their information is off limits. For highly regulated industries or companies building on the frontier, that gray area could be a dealbreaker. 
Fears about where data goes, who can see it, and how it might be used are slowing AI adoption in sectors like healthcare, finance, and government.\u00a0<\/p>\n<p class=\"wp-block-paragraph\">Enter San Francisco-based startup <a rel=\"nofollow noopener\" href=\"https:\/\/confident.security\/\" target=\"_blank\">Confident Security<\/a>, which aims to be \u201cthe Signal for AI.\u201d The company\u2019s product, CONFSEC, is an end-to-end encryption tool that wraps around foundational models, guaranteeing that prompts and metadata can\u2019t be stored, seen, or used for AI training, even by the model provider or any third party.<\/p>\n<p class=\"wp-block-paragraph\">\u201cThe second that you give up your data to someone else, you\u2019ve essentially reduced your privacy,\u201d Jonathan Mortensen, founder and CEO of Confident Security, told TechCrunch. \u201cAnd our product\u2019s goal is to remove that trade-off.\u201d<\/p>\n<p class=\"wp-block-paragraph\">Confident Security came out of stealth on Thursday with $4.2 million in seed funding from Decibel, South Park Commons, Ex Ante, and Swyx, TechCrunch has exclusively learned. The company wants to serve as an intermediary between AI vendors and their customers \u2013 like hyperscalers, governments, and enterprises.<\/p>\n<p class=\"wp-block-paragraph\">Even AI companies could see the value in offering Confident Security\u2019s tool to enterprise clients as a way to unlock that market, said Mortensen.
He added that CONFSEC is also well-suited for new AI browsers hitting the market, like Perplexity\u2019s recently released <a href=\"https:\/\/techcrunch.com\/2025\/07\/09\/perplexity-launches-comet-an-ai-powered-web-browser\/\" target=\"_blank\" rel=\"noopener\">Comet<\/a>, to give customers guarantees that their sensitive data isn\u2019t being stored on a server somewhere that the company or bad actors could access, or that their work-related prompts aren\u2019t being used to \u201ctrain AI to do your job.\u201d<\/p>\n<p class=\"wp-block-paragraph\">CONFSEC is modeled after Apple\u2019s Private Cloud Compute (PCC) architecture, which Mortensen says \u201cis 10x better than anything out there in terms of guaranteeing that Apple cannot see your data\u201d when it runs certain AI tasks securely in the cloud.<\/p>\n<p class=\"wp-block-paragraph\">Like Apple\u2019s PCC, Confident Security\u2019s system works by first anonymizing data by encrypting and routing it through services like Cloudflare or Fastly, so servers never see the original source or content.
Next, it uses advanced encryption that only allows decryption under strict conditions.<\/p>\n<p class=\"wp-block-paragraph\">\u201cSo you can say you\u2019re only allowed to decrypt this if you are not going to log the data, and you\u2019re not going to use it for training, and you\u2019re not going to let anyone see it,\u201d Mortensen said.\u00a0<\/p>\n<p class=\"wp-block-paragraph\">Finally, the software running the AI inference is publicly logged and open to review so that experts can verify its guarantees.\u00a0<\/p>\n<p class=\"wp-block-paragraph\">\u201cConfident Security is ahead of the curve in recognizing that the future of AI depends on trust built into the infrastructure itself,\u201d Jess Leao, partner at Decibel, said in a statement. \u201cWithout solutions like this, many enterprises simply can\u2019t move forward with AI.\u201d<\/p>\n<p class=\"wp-block-paragraph\">It\u2019s still early days for the year-old company, but Mortensen said CONFSEC has been tested, externally audited, and is production-ready. The team is in talks with banks, browsers, and search engines, among other potential clients, to add CONFSEC to their infrastructure stacks.\u00a0<\/p>\n<p class=\"wp-block-paragraph\">\u201cYou bring the AI, we bring the privacy,\u201d said Mortensen.<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/techcrunch.com\/2025\/07\/17\/confident-security-the-signal-for-ai-comes-out-of-stealth-with-4-2m\/\" target=\"_blank\" rel=\"noopener\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>As consumers, businesses, and governments flock to the promise of cheap, fast, and seemingly magical AI tools, one question keeps getting in the way: How do I keep my data private? 
Tech giants like OpenAI, Anthropic, xAI, Google, and others are quietly scooping up and retaining user data to improve their models or monitor for [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":181266,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14],"tags":[],"class_list":{"0":"post-181265","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech"},"_links":{"self":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/181265","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/comments?post=181265"}],"version-history":[{"count":0,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/181265\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media\/181266"}],"wp:attachment":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media?parent=181265"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/categories?post=181265"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/tags?post=181265"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}