The state of cloud-native technology and AI

Dec 13th, 2023


Noa Shavit



A couple of weeks ago, Aserto CEO Omri Gazitt sat down with Dustin Kirkland of Chainguard, David Aronchick of Expanso, and John Furrier of SiliconANGLE to chat about emerging cloud-native technology and the role of AI.

Watch the video for the full conversation, or read the summarized transcript below.

John Furrier:

Now that Kubernetes is starting to go mainstream, what is the most interesting innovation relative to this next level of enablement?

Omri Gazitt:

I’m excited about building the next layer on top of core infrastructure. Identity and access are next in line, and access specifically is what we focus on. It’s a classic example of something that every application builds on its own. And I'm excited to help developers stop building access control, which takes up 20% of the logic of the average application.

Authorization is boilerplate logic. It's manual and error-prone. It's mind-numbing, soul-crushing work. But if you get it wrong, you can have serious problems. Unfortunately, 94% of the applications tested by OWASP exhibit some form of broken access control. We’re excited about making fine-grained authorization available as a standard developer API.
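Externalizing authorization means the application asks a decision service "can this subject perform this action on this resource?" instead of hard-coding the logic. A minimal sketch of what a relation-based, fine-grained check could look like (the relation tuples, permission map, and `is_allowed` helper are hypothetical illustrations, not Aserto's actual API):

```python
# Sketch of an externalized, fine-grained authorization check.
# All data and names here are hypothetical, for illustration only.

# Relation tuples: (subject, relation, resource)
RELATIONS = {
    ("alice", "owner", "doc:roadmap"),
    ("bob", "viewer", "doc:roadmap"),
}

# Which relations grant which permissions on a resource.
PERMISSIONS = {
    "can_read": {"owner", "viewer"},
    "can_delete": {"owner"},
}

def is_allowed(subject: str, permission: str, resource: str) -> bool:
    """Return True if any of the subject's relations grants the permission."""
    granted_by = PERMISSIONS.get(permission, set())
    return any((subject, rel, resource) in RELATIONS for rel in granted_by)

print(is_allowed("alice", "can_delete", "doc:roadmap"))  # True
print(is_allowed("bob", "can_delete", "doc:roadmap"))    # False
```

The point of the pattern is that the relation data and the permission mapping live in one service, so every application asks the same question the same way instead of re-implementing the logic.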

It looks like authorization is finally getting its day in the sun. I've seen more awareness now of the fact that you can externalize authorization than ever before. We’ve also been able to establish an OpenID Foundation working group focused on authorization standards, called AuthZEN. These standards will help us create an interoperable authorization fabric, just as OAuth2 and OIDC have done for authentication.

David Aronchick:

I think we are seeing the focus shift back to the application. People are thinking about their application layer and data layer in a much more thoughtful way.

Kubernetes will be an incredible innovator, but organizations that want to succeed and take advantage of all these models will need to have a data plan. They’ll need to think about how to schedule against data and move it at that application layer.

John Furrier:

That’s interesting. The scale and the data tsunami coming in is just unprecedented. It's a whole different game.

Dustin Kirkland:

I'm curious about the expense around AI, such as the NVIDIA H100 GPUs and other hardware. Do you think there's a world that emerges where there are haves and have-nots when it comes to AI, ML inferencing, and smarter decision making?

Omri Gazitt:

I think that the big players are driving down costs and democratizing the technology. At the same time, I think that the level of scale that they have is unmatchable by small companies. So, if you are a startup that's trying to innovate at the AI infrastructure level, you’ll have to find a niche. There are so many applications; large language models (LLMs) are all the rage, but there are many others.

I'm excited about conventional machine learning, where you run anomaly detection on a large set of access logs, for example. This will let you detect unanticipated and often unwanted behavior. You might find that a user in the engineering group signed a purchase order, when they really shouldn’t have the authority to do that. This is a classic application of machine learning. It doesn't require special hardware; you don't have to buy an H100.
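As a concrete illustration of the kind of anomaly detection Omri describes, here is a minimal sketch that flags users whose activity volume deviates sharply from the norm using a simple z-score over per-user event counts. The log entries and threshold are made up for the example; a real system would use richer features (action types, time of day, resource sensitivity) and a proper model:

```python
# Sketch: flag anomalous actors in access logs via a z-score over
# per-user event counts. Log data and threshold are illustrative only.
from collections import Counter
from statistics import mean, pstdev

access_log = (
    ["alice:read"] * 40 + ["bob:read"] * 38 + ["carol:read"] * 42
    + ["mallory:sign_po"] * 400  # unusual burst of activity
)

# Count events per user.
counts = Counter(entry.split(":")[0] for entry in access_log)
mu, sigma = mean(counts.values()), pstdev(counts.values())

def anomalous(user: str, threshold: float = 1.5) -> bool:
    """A user is anomalous if their event count is far above the mean."""
    return (counts[user] - mu) / sigma > threshold

flagged = [user for user in counts if anomalous(user)]
print(flagged)  # ['mallory']
```

Nothing here needs a GPU; the whole pipeline is a counter and two summary statistics, which is exactly the point about conventional ML on commodity hardware.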

John Furrier:

There's an old expression in the Silicon Valley entrepreneurial world: “That's a feature, not a company.” Today you can actually make a company out of a feature. You can pick a vertical, go deep, and be very successful.

David Aronchick:

Absolutely. I think there are two players in the world: those that sell what I jokingly refer to as “specialized electricity,” and those that don't. By “specialized electricity” I mean storage, compute, and networking. I think there will be at most half a dozen companies selling “specialized electricity.” There's plenty of room in business for those who won’t.

LLMs have transformed the way we can store knowledge and access it. We can share a ridiculous amount of structured, unstructured, and semi-structured data, specify how we want to store it, and then ask arbitrary questions against that data. Was that available before? Sure, but you had to build a very deep structure and be a subject matter expert.

Today we have a way that we can interact with that body of knowledge in a straightforward way and get an output. That output is not going to be a company, but it can be a quick and easy prototype of an idea.

John Furrier:

I want to get your thoughts on this idea of platforms. Are there too many platforms? What is a platform in this model when you have hyper scale layers?

I'm starting to see companies have platforms without buying hardware. Is “platform” just another word for a distributed digital system an enterprise has to run?

Omri Gazitt:

I'm seeing more and more enterprises with platform services teams. These teams assemble a set of capabilities that application developers need. They’re trying to build a standard platform and create rails for application developers, so that the app developers don't have to cobble together these capabilities on their own or reinvent the wheel.

John Furrier:

Are they building their own platforms, or are they using vendors to help them?

Omri Gazitt:

Wherever possible, they're trying to use mature open-source and vendor-supported technology, but sometimes they still need to build their own. They are trying to create their own specialized platform, using the technology that's available to them.

John Furrier:

If you were 25 again, what would you be doing?

Omri Gazitt:

When I was studying computer science, I was excited about going from assembler to high-level languages and building compilers. Today, the tools are vastly different. We can envision an application, describe it to a computer, and the computer will generate a pretty good first draft of it - it's mind-blowing! The level of capability that developers now have is staggering.

I think that today's university grads would be well served to use all the amazing tech that is available to them. Every student should use the latest and greatest tools, because that will make them so much more productive.

David Aronchick:

I couldn't agree more - it's productivity. 100% of my unit tests have been pre-written by my Copilot, for example.

AI can also help with projects we do not want to fully automate. As Omri said, we now have an opportunity to interact with these models in new and interesting ways. They're able to surface information that is not only valuable but that you might not even have known was out there, allowing you to build on top of it.

Conclusion

In this post, we delved into the evolving landscape of AI applications, from performing anomaly detection on large datasets of logs to creating first prototypes of ideas described to them. We also reviewed the latest market trends, including a rise in awareness of externalized authorization, an increase in enterprises with platform teams, and a greater emphasis on data at the application layer. We concluded with the pivotal role AI plays in productivity.

Cloud-native authorization is an emerging market that aims to help application developers stop building permissions. Aserto offers a fine-grained authorization service that is flexible enough to support any logic and easy enough to be implemented by a single developer and production-ready in 1-2 weeks. If you’d like to learn more, drop us a line, or join our community Slack.


Noa Shavit

Head of Marketing