Passwordless Auth to Servers: Hands On with ASA

These days, you don't need a separate password for every website you log into. Yet thanks to tech debt and tradition, many DevOps professionals are still wrangling a host of SSH keys to access the servers where we sometimes need to be. With modern OAuth, a single login and second factor to prove your identity are enough to securely get you into every service that you're authorized to access. What if SSHing into servers was that easy? In this workshop, we'll use Okta's Advanced Server Access tool (formerly ScaleFT) to experience one way that the dream of sending SSH keys the way of the password has been realized.


- we'll discuss how ASA works and when it's the right tool for the job

- we'll walk through setting up a free trial Okta account to use ASA from, and configuring the ASA gateway and server on Linux servers

- we'll then SSH into our hosts with the ASA clients without needing to supply an SSH key from our laptops

- we'll review the audit logs of our SSH sessions to examine what commands were run

FAQ

Emily is a developer advocate at Okta with a background in DevOps and open source technology. She focuses on introducing and demonstrating tools like Okta's ASA, specializing in managing server access and security.

Okta's ASA (Advanced Server Access) tool is designed to manage server access securely by issuing one-time-use keys for SSH and RDP access, and it integrates with an identity provider for centralized management of authentication and authorization.

Authentication verifies if a user is the same individual as previously identified, often using credentials like passwords or biometrics. Authorization, on the other hand, determines if the authenticated user has permissions to perform specific actions or access certain resources.
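
To make the distinction concrete, here is a minimal, purely illustrative Python sketch. The user store, policy table, and function names are invented for this example (they are not part of ASA or any library): authentication checks a presented credential against what we already know about a user, while authorization checks what that verified user is currently allowed to do.

```python
import hashlib
import hmac

# Toy user store: username -> salted password hash (illustrative only).
USERS = {
    "edunham": hashlib.sha256(b"salt" + b"correct horse battery staple").hexdigest(),
}

# Toy policy table: username -> set of actions they may currently perform.
PERMISSIONS = {
    "edunham": {"ssh:web-01", "ssh:web-02"},
}

def authenticate(username: str, password: str) -> bool:
    """Authentication: is this the same person who registered this account?"""
    expected = USERS.get(username)
    if expected is None:
        return False
    presented = hashlib.sha256(b"salt" + password.encode()).hexdigest()
    return hmac.compare_digest(presented, expected)

def authorize(username: str, action: str) -> bool:
    """Authorization: is this authenticated person allowed to do this action?"""
    return action in PERMISSIONS.get(username, set())

if __name__ == "__main__":
    user, password = "edunham", "correct horse battery staple"
    if authenticate(user, password) and authorize(user, "ssh:web-01"):
        print("connect to web-01")
    else:
        print("access denied")
```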

Using a centralized identity provider for server access, such as in Okta's ASA, streamlines the authentication process, reduces the need to manage multiple keys, and enhances security by centralizing control over who can access what resources and when.

Setting up Okta's ASA involves creating an Okta account, integrating it with the ASA application, configuring user permissions and groups, and setting up server and client access through ASA's management interface.
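
For orientation, the client side of that setup boils down to a handful of `sft` commands. The sketch below drives them from Python purely for illustration; the server name "web-01" is a placeholder, and the exact commands may have changed since this workshop, so check the current Okta ASA documentation.

```python
"""Rough sketch of the client-side ASA flow, driven from Python for illustration.

In practice you would run these `sft` commands directly in a terminal. The
server name "web-01" is a placeholder for a host enrolled in your ASA project,
and the command names reflect the ScaleFT/ASA client around the time of this
workshop -- confirm against current Okta documentation before relying on them.
"""
import subprocess

def run(cmd: list[str]) -> None:
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # One-time: enroll this laptop's sft client into your ASA team
    # (opens a browser to authenticate against the identity provider).
    run(["sft", "enroll"])

    # Start an authenticated session; the credentials it mints are short-lived.
    run(["sft", "login"])

    # List the servers your ASA groups authorize you to reach ...
    run(["sft", "list-servers"])

    # ... and SSH into one of them without a long-lived key on this laptop.
    run(["sft", "ssh", "web-01"])
```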

The zero trust model in Okta's ASA rechecks a user's authentication and authorization on virtually every attempt to access resources. This frequent verification minimizes the risk of unauthorized access and enhances overall security.
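
Here is a hypothetical sketch of what that rechecking looks like in code, with invented names and a made-up policy table standing in for the identity provider: instead of trusting a long-lived key, every access verifies both that the short-lived credential is still fresh and that the current policy still allows the action.

```python
import time
from dataclasses import dataclass

# Illustrative only: short-lived credentials force frequent re-verification.
CREDENTIAL_TTL_SECONDS = 600

@dataclass
class Credential:
    user: str
    issued_at: float

    def expired(self) -> bool:
        return time.time() - self.issued_at > CREDENTIAL_TTL_SECONDS

# Stand-in for the identity provider's live policy; in ASA this corresponds to
# project and group configuration, which admins can change at any moment.
LIVE_POLICY = {"edunham": {"ssh:web-01"}}

def access_server(cred: Credential, server: str) -> bool:
    """Zero-trust style check: verify freshness and current policy on every
    attempt, rather than trusting a key issued long ago."""
    if cred.expired():
        print("credential expired -- re-authenticate with the identity provider")
        return False
    if f"ssh:{server}" not in LIVE_POLICY.get(cred.user, set()):
        print("current policy does not authorize this access")
        return False
    print(f"{cred.user} connected to {server}")
    return True

if __name__ == "__main__":
    cred = Credential(user="edunham", issued_at=time.time())
    access_server(cred, "web-01")   # allowed right now
    LIVE_POLICY["edunham"].clear()  # an admin revokes access...
    access_server(cred, "web-01")   # ...and the very next attempt is denied
```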

Enrolling a server in Okta's ASA involves generating an enrollment token from the ASA interface, placing this token on the server, and installing the ASA agent which will register the server with the identity provider.
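
If you script that enrollment from a provisioning tool, it amounts to dropping the token where the agent looks for it and installing the agent. The Python sketch below is only a rough outline under stated assumptions: the service name (sftd), package name (scaleft-server-tools), and token path (/var/lib/sftd/enrollment.token) match the agent's documentation around the time of this workshop, but verify them against the current Okta ASA docs before using anything like this.

```python
"""Sketch of automating ASA server enrollment from a provisioning script.

Assumptions to verify against current Okta ASA documentation: the server agent
ships as the `scaleft-server-tools` package, runs as the `sftd` service, and
reads an enrollment token from /var/lib/sftd/enrollment.token on first start.
The token value itself comes from creating an enrollment token in the ASA
project's web console. Run as root on a Debian/Ubuntu host with the Okta
package repository already configured.
"""
import pathlib
import subprocess

TOKEN_PATH = pathlib.Path("/var/lib/sftd/enrollment.token")

def enroll_server(enrollment_token: str) -> None:
    # Place the project enrollment token where the agent expects to find it.
    TOKEN_PATH.parent.mkdir(parents=True, exist_ok=True)
    TOKEN_PATH.write_text(enrollment_token.strip() + "\n")

    # Install and start the agent; it registers the host with the ASA project.
    subprocess.run(["apt-get", "install", "-y", "scaleft-server-tools"], check=True)
    subprocess.run(["systemctl", "enable", "--now", "sftd"], check=True)

if __name__ == "__main__":
    enroll_server("paste-the-token-from-the-ASA-web-console-here")
```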

In Okta's ASA, changes to user permissions are dynamically updated, meaning if a user's permissions are revoked or altered, their access to servers and resources is immediately impacted, ensuring security is maintained in real-time.

E. Dunham
32 min
29 Mar, 2022

Video Summary and Transcription

This workshop introduces Okta's ASA tool for managing server access and explores the concepts of authentication and authorization. It discusses the challenges of managing SSH keys and the benefits of one-time-use keys, and explains how Okta's ASA simplifies server access by reusing an identity provider. It then provides a step-by-step guide for setting up and managing authorization and authentication with Okta's ASA, and finally covers setting up servers and clients, as well as the cleanup after testing.

1. Introduction to Okta's ASA

Short description:

Hi there, everybody. I am Emily to humans or eDunham to computers. I'm an open source nerd and sysadmin from back when we were just starting to call ourselves DevOps. I work as a developer advocate at Okta. And I will be showing you an Okta tool today, but my opinions and any errors I might make are entirely my own. A lot of thought leadership and advances in DevOps focus on getting us away from running and managing servers. But a lot of us still do have to run servers. Maybe it's because we're working on migrating out of an old architecture that was state of the art when it was first designed. Or maybe we're dealing with workloads that just genuinely require us to more directly manage our compute. Either way, when we manage our own cloud or hardware resources, we can still factor out individual tasks that would be better to do with purpose-built software or services. Today, I'd like to introduce you to a trick for factoring out one of the hard parts of managing who can access what and when, and show you how to replicate that trick. The tool that does this cool trick, Okta's ASA, will not be the perfect fit for all access use cases. However, even if you never end up working with ASA directly, I hope that understanding its context will help you think critically about the strengths and weaknesses of your existing tools like conventional SSH. Knowing more about your tools and their alternatives can help you make the best security choices for each operational challenge that you find yourself up against.

Hi there, everybody. I am Emily to humans or eDunham to computers. I'm an open source nerd and sysadmin from back when we were just starting to call ourselves DevOps. I work as a developer advocate at Okta. And I will be showing you an Okta tool today, but my opinions and any errors I might make are entirely my own.

We've got about two hours and a split audience. Some people are here in the Zoom room, and others are watching this recording in the future. We'll record the first half with all these explanations so that future people can join us for it. But for the hands-on half, toward the end, I'll stop the recording so that we can just chat. And you can ask the questions that you might not want immortalized for posterity. You don't have to watch the screen as you listen to this. But it will help to see the screenshots once I get to the setup overview.

Now, a lot of thought leadership and advances in DevOps focus on getting us away from running and managing servers. That's often great. You can offload individual tasks to people or entities that specialize in them. Your database is run by database experts, your network tooling is managed by networking experts. And that's awesome when it's something you can do. When you have the resources to offload domain-specific work onto specialists in those domains, they will usually do it better than your team could. Just like how when you're coding, if you can use a well-tested library of cryptographic primitives instead of trying to roll your own, you'll almost always end up with more secure code as a result.

But a lot of us still do have to run servers. Maybe it's because we're working on migrating out of an old architecture that was state of the art when it was first designed. Or maybe we're dealing with workloads that just genuinely require us to more directly manage our compute. Either way, when we manage our own cloud or hardware resources, we can still factor out individual tasks that would be better to do with purpose-built software or services. Today, I'd like to introduce you to a trick for factoring out one of the hard parts of managing who can access what and when, and show you how to replicate that trick.

Now, as a person who listens to talks and uses tools also, I'll start with that disclaimer that I'm always listening for. The tool that does this cool trick, Okta's ASA, will not be the perfect fit for all access use cases. The biggest blocker you might encounter is that the pricing is per server, and it's most tested with the assumption that you're using Okta as an identity provider. The identity provider bit is relatively negotiable, but sadly the pricing isn't. However, even if you never end up working with ASA directly, I hope that understanding its context will help you think critically about the strengths and weaknesses of your existing tools like conventional SSH. Knowing more about your tools and their alternatives can help you make the best security choices for each operational challenge that you find yourself up against.

2. Understanding Authentication and Authorization

Short description:

Since we're talking about security, let's clarify what auth means. It expands in two ways: authentication and authorization. Authentication verifies if someone is the same person as before, while authorization determines if the authenticated person is allowed to perform a specific action. No authentication method is perfect, but adding multiple factors makes it harder for attackers to impersonate us. Authentication alone is not enough for authorization. Just like having an ID does not guarantee entry into a bar, being authenticated does not guarantee authorization. Authorization can change over time. Understanding authentication and authorization is crucial when managing access to resources.

Since we're talking about security, I'd like to make sure we're all on the same page about what auth even means. This will be old news to some of you, but everyone hears it for the first time at some point and people come into DevOps from all different backgrounds. So, the trick to auth is that it expands two different ways. First, it might mean authentication. Authentication answers the question, is this the same person as before? For instance, a driver's license or passport is often used for authentication to get into places like bars or airplanes. The document with my face on it shows that I'm probably the same person that the government issued that ID to. My username, password and access to my phone are a form of authentication for many online accounts. If I have them, I'm probably the same person who created the account. My cat's microchip is also a form of authentication. If she ran off and someone scanned her chip to figure out how to get her home, it would tell them that she's the same cat that got put into that database when I first took her to the vet.

So, the disclaimer here is that no form of authentication is perfect. Anywhere that software interfaces with the rest of the world, there's room for unexpected things to happen. If someone knew all of my secrets and had access to all of my devices and could use my biometrics, they could impersonate me. If someone could convincingly make their face look just like mine, they could pretend to be me using my photo ID. To paraphrase a great author, as technology advances, the technology for fooling that technology grows alongside it. It keeps things interesting. But what we can do is make it prohibitively inconvenient for someone else to impersonate us. Every time we add another factor of authentication, we make it harder for an attacker to pretend successfully that they are us. To get into my work accounts, you'd need to know my password, and you'd need to have my phone, and you'd need my biometrics all together. It's easy-ish for a password to be compromised, especially if someone reuses it. It's easy-ish for a phone to get stolen or a SIM card to get hijacked. But the more factors you add together, the more things would have to go right for an attacker to defeat your authentication mechanisms. So, that's authentication. It answers the question, are you the same person as before?

Now, authentication is necessary but not sufficient for authorization. Just like being the same person on my ID isn't enough to guarantee that I can get into a bar, I also need to be authorized by being of age. Having a passport isn't enough to let you get onto a plane, you also need to be authorized for that flight by having a ticket. Authorization can change from day to day or even minute to minute. If I had a ticket for this flight yesterday, that doesn't mean I can get on the same flight tomorrow without a new ticket. If I'm logged into my Google account and someone shares a document with me, I wasn't authorized to see it a moment ago, and I am now. So, authorization answers the question, is this authenticated person allowed to do the thing they're trying to? What has this got to do with accessing servers? Well, if you've ever managed access to resources as an Ops person, you've reasoned about both authentication of users and authorization.

Watch more workshops on topic

0 to Auth in an hour with ReactJS
React Summit 2023
56 min
Workshop (Free)
Kevin Gao
Passwordless authentication may seem complex, but it is simple to add it to any app using the right tool. There are multiple alternatives that are much better than passwords to identify and authenticate your users - including SSO, SAML, OAuth, Magic Links, One-Time Passwords, and Authenticator Apps.
While addressing security aspects and avoiding common pitfalls, we will enhance a full-stack JS application (Node.js backend + React frontend) to authenticate users with OAuth (social login) and One Time Passwords (email), including:
- User authentication - managing user interactions, returning session / refresh JWTs
- Session management and validation - storing the session securely for subsequent client requests, validating / refreshing sessions
- Basic authorization - extracting and validating claims from the session token JWT and handling authorization in backend flows
At the end of the workshop, we will also touch other approaches of authentication implementation with Descope - using frontend or backend SDKs.
Deploying React Native Apps in the Cloud
React Summit 2023
88 min
Workshop (Free)
Cecelia Martinez
Deploying React Native apps manually on a local machine can be complex. The differences between Android and iOS require developers to use specific tools and processes for each platform, including hardware requirements for iOS. Manual deployments also make it difficult to manage signing credentials, environment configurations, track releases, and to collaborate as a team.
Appflow is the cloud mobile DevOps platform built by Ionic. Using a service like Appflow to build React Native apps not only provides access to powerful computing resources, it can simplify the deployment process by providing a centralized environment for managing and distributing your app to multiple platforms. This can save time and resources, enable collaboration, as well as improve the overall reliability and scalability of an app.
In this workshop, you’ll deploy a React Native application for delivery to Android and iOS test devices using Appflow. You’ll also learn the steps for publishing to Google Play and Apple App Stores. No previous experience with deploying native applications is required, and you’ll come away with a deeper understanding of the mobile deployment process and best practices for how to use a cloud mobile DevOps platform to ship quickly at scale.
MERN Stack Application Deployment in Kubernetes
DevOps.js Conf 2022
152 min
Workshop
Joel Lord
Deploying and managing JavaScript applications in Kubernetes can get tricky. Especially when a database also has to be part of the deployment. MongoDB Atlas has made developers' lives much easier, however, how do you take a SaaS product and integrate it with your existing Kubernetes cluster? This is where the MongoDB Atlas Operator comes into play. In this workshop, the attendees will learn about how to create a MERN (MongoDB, Express, React, Node.js) application locally, and how to deploy everything into a Kubernetes cluster with the Atlas Operator.
Azure Static Web Apps (SWA) with Azure DevOps
DevOps.js Conf 2022
13 min
Workshop (Free)
Juarez Barbosa Junior
Azure Static Web Apps were launched earlier in 2021, and out of the box, they could integrate your existing repository and deploy your Static Web App from Azure DevOps. This workshop demonstrates how to publish an Azure Static Web App with Azure DevOps.
How to develop, build, and deploy Node.js microservices with Pulumi and Azure DevOps
DevOps.js Conf 2022
163 min
Workshop
Alex Korzhikov, Andrew Reddikh
The workshop gives a practical perspective on key principles needed to develop, build, and maintain a set of microservices in the Node.js stack. It covers specifics of creating isolated TypeScript services using the monorepo approach with lerna and yarn workspaces. The workshop includes an overview and a live exercise to create a cloud environment with the Pulumi framework and Azure services. The session best fits developers who want to learn and practice build and deploy techniques using the Azure stack and Pulumi for Node.js.

Check out more articles and videos

We constantly look for articles and videos that might spark Git people's interest, skill us up, or help build a stellar career

Levelling up Monorepos with npm Workspaces
DevOps.js Conf 2022
33 min
Top Content
Learn more about how to leverage the default features of npm workspaces to help you manage your monorepo project while also checking out some of the new npm cli features.
It's a Jungle Out There: What's Really Going on Inside Your Node_Modules Folder
Node Congress 2022
26 min
Top Content
Do you know what’s really going on in your node_modules folder? Software supply chain attacks have exploded over the past 12 months and they’re only accelerating in 2022 and beyond. We’ll dive into examples of recent supply chain attacks and what concrete steps you can take to protect your team from this emerging threat.
You can check the slides for Feross' talk here.
Automating All the Code & Testing Things with GitHub Actions
React Advanced Conference 2021
19 min
Top Content
Code tasks like linting and testing are critical pieces of a developer's workflow that help keep us sane, preventing syntax or style issues and hardening our core business logic. We'll talk about how we can use GitHub Actions to automate these tasks and help keep our projects running smoothly.
Fine-tuning DevOps for People over Perfection
DevOps.js Conf 2022
33 min
Top Content
Demand for DevOps has increased in recent years as more organizations adopt cloud native technologies. Complexity has also increased, and a "zero to hero" mentality leaves many people chasing perfection and FOMO. This session focuses instead on why maybe we shouldn't adopt a technology practice, and how sometimes teams can achieve the same results by prioritizing people over ops automation and controls. Let's look at fine-tuning the amount of everything-as-code, pull requests, DevSecOps, monitoring and more, to prioritize developer well-being over optimization perfection. It can be a valid decision to deploy less and sleep better. And finally we'll examine how manual practice and discipline can be the key to superb products and experiences.
Why is CI so Damn Slow?
DevOps.js Conf 2022
27 min
We've all asked ourselves this while waiting an eternity for our CI job to finish. Slow CI not only wrecks developer productivity by breaking our focus, it also costs money in cloud computing fees and wastes enormous amounts of electricity. Let's take a dive into why this is the case and how we can solve it with better, faster tools.
The State of Passwordless Auth on the Web
JSNation 2023
30 min
Can we get rid of passwords yet? They make for a poor user experience and users are notoriously bad with them. The advent of WebAuthn has brought a passwordless world closer, but where do we really stand?
In this talk we'll explore the current user experience of WebAuthn and the requirements a user has to fulfill for them to authenticate without a password. We'll also explore the fallbacks and safeguards we can use to make the password experience better and more secure. By the end of the session you'll have a vision for how authentication could look in the future and a blueprint for how to build the best auth experience today.