
Build a CRUD Application with React, Spring Boot, and User Authentication


React is one of the most popular libraries for creating web application frontends. With Spring Boot it’s easier than ever to create a CRUD backend for your React-fronted application. In this tutorial, we’ll tie those together and then use Stormpath to add authentication and authorization protocols.

We’ll start by creating a static data view using React. Then we will create a REST backend with Spring Boot, tie it in, and add user security with Stormpath. Everything should be straightforward, even if you’ve never used React before.

The source code that backs this post can be found in this GitHub repo.

Serving the Front

Normally React applications are served up using Node.js, but if you’re a Java dev you’ll likely be super comfortable with this Spring Boot approach.

Initially, you’ll put the whole application in one file, index.html. To tell Spring Boot to serve it as the homepage you can use the @Controller annotation.

@Controller
public class HomeController {

    @RequestMapping(value = "/")
    public String index() {
        return "index.html";
    }
}

Create an empty directory and put the above code into src/main/java/tutorial/HomeController.java. Spring Boot will then look for src/main/resources/static/index.html when you load the site.

<!DOCTYPE html>
<html>
<head>
    <title>React + Spring</title>
</head>
<body>
</body>
</html>

Create a pom.xml and a Spring Boot application class. Use the following for your POM.

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>demo</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>1.4.1.RELEASE</version>
    </parent>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

Put the following into src/main/java/tutorial/Application.java.

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

When you start the server with mvn spring-boot:run and visit localhost:8080 you should see a blank page with the title “React + Spring”.

React + Spring

Remove Restarts

Normally you’d have to restart your server each time you make a change to your front-end, which is a pain. Spring Boot’s developer tools let us get around this. Add the following dependency to your POM.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <optional>true</optional>
</dependency>

Also add this configuration to your Spring Boot Maven Plugin:

<configuration>
    <addResources>true</addResources>
</configuration>

Now, when you make a change to your application, or recompile any classes, it should update when you refresh your browser.

React Barebones HTML

On to React! The most basic React page has three things: a root element, JavaScript imports, and a script tag.

<!DOCTYPE html>
<html>
<head>
    <title>React + Spring</title>
</head>
<body>
    <div id='root'></div>

    <script src="https://fb.me/react-15.0.1.js"></script>
    <script src="https://fb.me/react-dom-15.0.1.js"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/babel-core/5.8.23/browser.min.js"></script>

    <script type="text/babel"></script>
</body>
</html>

The root element is where React will insert our view HTML. The imports pull in three libraries: two for React itself and another to translate our view templates using Babel.

Note: for ease of testing we are pulling in libraries using a CDN, but normally you would use something like webpack to combine all your JavaScript into one file.

Now put your React code inside the script tag.

React Basics

As described in the Thinking in React tutorial, you should start coding your application by breaking the interface down into components.

<script type="text/babel">
var Employee = React.createClass({});
var EmployeeTable = React.createClass({});
</script>

Here you’ve created two—one for a table of employees and another for an employee entry. Each component then needs a render function which describes the HTML to generate.

<script type="text/babel">
var Employee = React.createClass({
  render: function() {
    return (<div>employee</div>);
  }
});
var EmployeeTable = React.createClass({
  render: function() {
    return (<div>employee table</div>);
  }
});
</script>

Here’s where the Babel compiler comes in, converting the JSX (the HTML-like syntax) into the correct React calls. Note how the div tags are returned from the render function.

You need to tell React to insert the parent component’s HTML into the root element. This is done using the ReactDOM.render method.

<script type="text/babel">
var Employee = React.createClass({
  render: function() {
    return (<div>employee</div>);
  }
});
var EmployeeTable = React.createClass({
  render: function() {
    return (<div>employee table</div>);
  }
});

ReactDOM.render(
  <EmployeeTable />, document.getElementById('root')
);
</script>

By refreshing the browser you should see the simple text element you created.

React + Spring employee table

To see the HTML React inserted into the root element you can use the browser’s inspector (Ctrl-Shift-I in Chrome).

Browser inspector

Tying Components Together

Now that you have components, let’s tie them together. You can start by trying to render hard-coded data; you’ll use the REST server later.

Above the ReactDOM command enter the following:

var EMPLOYEES = [
  {name: 'Joe Biden', age: 45, years: 5},
  {name: 'President Obama', age: 54, years: 8},
  {name: 'Crystal Mac', age: 34, years: 12},
  {name: 'James Henry', age: 33, years: 2}
];

Then add employees={EMPLOYEES} when you instantiate your table.

ReactDOM.render(
  <EmployeeTable employees={EMPLOYEES} />, document.getElementById('root')
);

As you might expect, this passes the data into a variable named employees. Inside EmployeeTable you can access this using this.props. Let’s use that to generate a table with a row for each employee.

var EmployeeTable = React.createClass({
  render: function() {
    var rows = [];
    this.props.employees.forEach(function(employee) {
      rows.push(<Employee employee={employee} />);
    });
    return (
      <table>
        <thead>
          <tr>
            <th>Name</th><th>Age</th><th>Years</th>
          </tr>
        </thead>
        <tbody>{rows}</tbody>
      </table>);
  }
});

This instantiates a new Employee class for each element in the data (setting the employee attribute) and pushes it to an array. Then {rows} drops in the required HTML from the child class.

Now all you need to do is update the render method on Employee.

var Employee = React.createClass({
  render: function() {
    return (
      <tr>
        <td>{this.props.employee.name}</td>
        <td>{this.props.employee.age}</td>
        <td>{this.props.employee.years}</td>
      </tr>);
  }
});

You can add Bootstrap to make the table look nice. Add the following just below your script import tags:

<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">

Then surround your main table with a container div and give the table element some Bootstrap class names.

<div className="container">
  <table className="table table-striped">
    <thead>
      <tr>
        <th>Name</th>
        <th>Age</th>
        <th>Years</th>
      </tr>
    </thead>
    <tbody>{rows}</tbody>
  </table>
</div>

Refreshing your browser should give a nice view of the data you hard-coded!


Adding Real Data

To use data objects coming from the server, you need to add a server! Doing this with Spring Boot is super simple. Inside src/main/java/tutorial/Employee.java add the following code:

@Data
@Entity
public class Employee {

    private @Id @GeneratedValue Long id;
    private String name;
    private int age;
    private int years;

    private Employee() {}

    public Employee(String name, int age, int years) {
        this.name = name;
        this.age = age;
        this.years = years;
    }
}

This is our bean. Note: the @Data annotation is from Project Lombok.

Now, create a repository using Spring Data JPA.

public interface EmployeeRepository extends CrudRepository<Employee, Long> {}

To load data, create a CommandLineRunner implementation that uses the repository to create new records in the database.

@Component
public class DatabaseLoader implements CommandLineRunner {

    private final EmployeeRepository repository;

    @Autowired
    public DatabaseLoader(EmployeeRepository repository) {
        this.repository = repository;
    }

    @Override
    public void run(String... strings) throws Exception {
        this.repository.save(new Employee("Joe Biden", 45, 5));
        this.repository.save(new Employee("President Obama", 54, 8));
        this.repository.save(new Employee("Crystal Mac", 34, 12));
        this.repository.save(new Employee("James Henry", 33, 2));
    }
}

The only thing left is pulling in dependencies. Adding the following to your pom.xml will allow your repository to become a REST endpoint.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-rest</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>

You’ll also need to include Project Lombok (which lets you skip writing getters and setters for your beans).

<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <version>1.16.10</version>
    <scope>provided</scope>
</dependency>

And you need a database (which Spring Boot autoconfigures). You can use H2, which is embedded and in-memory (i.e. the data won’t survive a restart).

<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
</dependency>

And that’s it! If you reboot now you’ll have a functioning REST server with data.

Mapping the URL

If you add the following to src/main/resources/application.properties, all your REST endpoints will be served under localhost:8080/api.

spring.data.rest.basePath=/api

Calling localhost:8080/api/employees from the command line should give a list of the data you loaded.
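
For example, with curl (output abbreviated; Spring Data REST wraps the results in a HAL-style _embedded envelope, which the React code below relies on):

curl http://localhost:8080/api/employees

{
  "_embedded" : {
    "employees" : [ {
      "name" : "Joe Biden",
      "age" : 45,
      "years" : 5,
      "_links" : {
        "self" : { "href" : "http://localhost:8080/api/employees/1" }
      }
    } ]
  }
}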

React and REST

Now you need to pull the data into your React view from the REST endpoint. You can do this with jQuery. Add the following import to your HTML:

<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.1.1/jquery.min.js"></script>

Now create a wrapper class that returns an EmployeeTable in its render method.

var App = React.createClass({

  loadEmployeesFromServer: function () {
    var self = this;
    $.ajax({
      url: "http://localhost:8080/api/employees"
    }).then(function (data) {
      self.setState({employees: data._embedded.employees});
    });
  },

  getInitialState: function () {
    return {employees: []};
  },

  componentDidMount: function () {
    this.loadEmployeesFromServer();
  },

  render() {
    return ( <EmployeeTable employees={this.state.employees}/> );
  }
});

You have to set up state: getInitialState initialises it to an empty list, and componentDidMount then loads the employees from the server once the component has mounted.

Now replace the main ReactDOM.render with your new class.

ReactDOM.render(<App />, document.getElementById('root') );

On refresh you should see the same view as before, except now the data is being loaded from the server.

Interactivity

The last thing you’ll want for your frontend is interactivity. Let’s add a delete button to see how that might work.

Add the following column to your employee render.

<td>
    <button className="btn btn-info" onClick={this.handleDelete}>Delete</button>
</td>

You’ll write the handleDelete method in a sec. After adding another heading to the employee table class, you should see buttons appear alongside each entry.
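
For example, the header row in EmployeeTable just gains an empty fourth column to sit above the buttons:

<tr>
  <th>Name</th><th>Age</th><th>Years</th><th></th>
</tr>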

React + Spring Boot Interactivity

Deleting from the Server

Before you send delete requests to the backend, it’s a good idea to add notification messages. For that you can use Toastr, which lets you show popup notifications. Include the following at the top of your HTML:

<script src="https://cdnjs.cloudflare.com/ajax/libs/toastr.js/2.1.3/toastr.min.js"></script>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/toastr.js/2.1.3/toastr.min.css">

Now in your script you can send messages with commands like toastr.error('something went wrong').

Let’s test that! Change your employee class to the following:

var Employee = React.createClass({
  getInitialState: function() {
    return {display: true };
  },
  handleDelete() {
    var self = this;
    $.ajax({
      url: self.props.employee._links.self.href,
      type: 'DELETE',
      success: function(result) {
        self.setState({display: false});
      },
      error: function(xhr, ajaxOptions, thrownError) {
        toastr.error(xhr.responseJSON.message);
      }
    });
  },
  render: function() {
    if (this.state.display==false) return null;
    else return (
      <tr>
        <td>{this.props.employee.name}</td>
        <td>{this.props.employee.age}</td>
        <td>{this.props.employee.years}</td>
        <td>
          <button className="btn btn-info" onClick={this.handleDelete}>Delete</button>
        </td>
      </tr>
    );
  }
});

This adds a display state that determines whether the row is rendered. The handleDelete method sends a DELETE request to the server (using the self href you got back from the GET request). If the request succeeds, display is set to false and the row disappears on the next render. Otherwise, Toastr notifies the user that an error occurred.

Try deleting an entry and refreshing the page. It should stay deleted.

Note: Restarting the server will bring back the same data since you’re using an in-memory database.

Add User Authentication

Let’s add one final feature to our React application: Stormpath for user authentication. You’ll need a forever-free developer account with Stormpath.

The first thing you need to do is put your Stormpath application details inside of your application.properties.

stormpath.application.href = <your app href>
stormpath.client.apiKey.id = <your api key id>
stormpath.client.apiKey.secret = <your api key secret>

Note: For security reasons you should not store your Stormpath keys inside of project files. Rather, use environment variables. See here.
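
For example, on macOS/Linux you could export them before starting the app (the variable names mirror the property keys above):

export STORMPATH_APPLICATION_HREF=<your app href>
export STORMPATH_CLIENT_APIKEY_ID=<your api key id>
export STORMPATH_CLIENT_APIKEY_SECRET=<your api key secret>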

Next, add the Stormpath starter to your Maven dependencies.

<dependency>
    <groupId>com.stormpath.spring</groupId>
    <artifactId>stormpath-default-spring-boot-starter</artifactId>
    <version>1.1.2</version>
</dependency>

You’ll also need to move your index.html file to src/main/resources/templates. This is because Stormpath’s Spring Boot starter uses the Thymeleaf templating library by default. Change the HomeController to return index as well.

@Controller
public class HomeController {

    @RequestMapping(value = "/")
    public String index() {
        return "index";
    }
}

You’ll also need to move your React code into a separate file, because Thymeleaf will choke on some of the JSX characters. Move the code from the inside of the script tag into src/main/webapp/public/app.js. This folder is open to the public by default. Then import this script at the bottom of your HTML.

<script type="text/babel" src="/public/app.js"></script>

Then create a security configuration class which applies stormpath().

@Configuration
public class Security extends WebSecurityConfigurerAdapter {
    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.apply(stormpath());
    }
}

Now when you reboot your server and try to reach the homepage you’ll be prompted with a login page.

Login or create an account

Logging in with the details of an account attached to the Stormpath application you put in application.properties should take you to the data view page as before.

Logging Out

You need to be able to log out as well. This is as easy as adding a form that sends a post to /logout (which Stormpath sets up by default).

<div class='container'>
    <div id='root'></div>
    <form action="/logout" method="post">
        <input class="btn btn-danger center-block" type="submit" value="Logout" />
    </form>
</div>

You can surround both the React root element and the form with a Bootstrap container for better alignment.


Clicking the logout button should take you back to the login screen as before.

Set up Authorization

Lastly, you’ll want to only let users with the correct access delete employees. To lock things down, you can use Spring Security’s PreAuthorize annotation. Change the repository code to the following:

public interface EmployeeRepository extends CrudRepository<Employee, Long> {

    @PreAuthorize("hasAuthority('ROLE_ADMIN')")
    @Override
    void delete(Long aLong);
}

Now only users with the authority ROLE_ADMIN will be able to delete. If you restart your server and try to click delete you should get a message saying “Access is denied”.


To give a user the required rights, you need to add them to a Stormpath group via the Admin Console.


In this example, there is a group called Supervisor that’s been attached to the relevant application. To integrate with this group, you simply need to replace the ROLE_ADMIN string with the HREF of the group and restart. If the user that’s logged in is a member of the Supervisor group (see Accounts), you should be allowed to delete.
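
For example, with a placeholder group href:

@PreAuthorize("hasAuthority('https://api.stormpath.com/v1/groups/<your group id>')")
@Override
void delete(Long aLong);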

Done and Dusted

Just like that you have created a complete CRUD web application with authentication and authorization, and React as the front-end. I hope you found this tutorial useful! If you have any questions about integrating React, Spring Boot, and Stormpath, please leave a comment.

To see a more complete React application using a Spring Boot backend see React.js and Spring Data REST.



Tutorial: User Authentication in AngularJS and ASP.NET Core


Modern web applications have an explicit separation between the server and the client. Clients use AngularJS, ReactJS, EmberJS, and others. Servers use Node.js, Java, and .NET. Microsoft’s .NET platform is a strong, battle-proven server-side framework, and AngularJS is arguably the most popular client-side framework. They work seamlessly together, but adding solid, secure authentication can seem daunting. Leveraging Stormpath makes it easier than you’d think!

Get the Base Application

If you want to follow along with this article, you can clone the master branch from GitHub or download it from here. If you just want to see the finished project, you can clone the finished branch, or download the finished app from here.

1. Get Your Stormpath Credentials

Log in to your Stormpath account. There is a link to Manage API Keys under Developer Tools on the right-hand side of the admin home page (if this is a new Stormpath account, the link will say “Create API Key”). At the bottom of that page there is an API Keys section; just click the “Create API Key” button. After confirming in the modal dialog, a file named apiKey-[APIKEYID].properties should automatically download to your computer. Hang on to that; we’ll use it later.

2. Add Stormpath to Your ASP.NET Core WebAPI

Visual Studio 2015
Open the Visual Studio Package Manager console and enter the following command:

PM> Install-Package Stormpath.AspNetCore

Visual Studio Code
If you aren’t using Visual Studio, you can edit the project.json file and add the following line to the dependencies section:

"Stormpath.AspNetCore": "*"

Then run dotnet restore (Chances are, VS Code will prompt you with a “Restore” button at the top of the IDE).

Adding the Stormpath Middleware

Open the Startup.cs file in the root of your project. This file is required for ASP.NET Core applications and is similar to the Global.asax file in ASP.NET. At the top of the file, add this using statement:

using Stormpath.AspNetCore;

Then, edit the ConfigureServices and Configure methods as follows:

public void ConfigureServices(IServiceCollection services)
{
    services.AddStormpath();

    // Add framework services.
    services.AddMvc();
}

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    // UseDefaultFiles rewrites "/" to "/index.html" and must come before UseStaticFiles
    app.UseDefaultFiles();
    app.UseStaticFiles();  // serves static files like .html or .js from wwwroot
    app.UseStormpath();
    app.UseMvc();          // sets up the MVC routes for the Web API
}

3. Add Your Stormpath Credentials to Your Application

Now we’ll install our Stormpath credentials into the application. Create a file in the root of the application and name it stormpath.yaml. Use the API key and secret contained in the apiKey.properties file you downloaded, as well as the Stormpath Application href to fill in the values in the YAML file:

---
application:
  href: "https://api.stormpath.com/v1/applications/<application_id>"
client:
  apiKey:
    id: "<id_found_in_file>"
    secret: "<secret_found_in_file>"

Configuration via a YAML file is simple and straightforward, but it’s important not to check this file into public source control, as it would expose your API key and secret. For production, Stormpath strongly recommends storing these values as the environment variables STORMPATH_APPLICATION_HREF, STORMPATH_CLIENT_APIKEY_ID, and STORMPATH_CLIENT_APIKEY_SECRET, respectively. Read the documentation on Environment Variables here.
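
For example, in PowerShell on Windows (use export on macOS/Linux):

$env:STORMPATH_APPLICATION_HREF = "https://api.stormpath.com/v1/applications/<application_id>"
$env:STORMPATH_CLIENT_APIKEY_ID = "<id_found_in_file>"
$env:STORMPATH_CLIENT_APIKEY_SECRET = "<secret_found_in_file>"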

4. Add Stormpath to the AngularJS App

You can use your package manager of choice (Bower or npm) for client-side dependencies. For simplicity’s sake, we’re going to get the reference directly from GitHub using RawGit.

Add a link reference in your index.html file in the root of your client app to the Stormpath Angular SDK and templates files.

<script src="//cdn.rawgit.com/stormpath/stormpath-sdk-angularjs/1.0.0/dist/stormpath-sdk-angularjs.min.js"></script>
<script src="//cdn.rawgit.com/stormpath/stormpath-sdk-angularjs/1.0.0/dist/stormpath-sdk-angularjs.tpls.min.js"></script>

Then add the references to the angular app module:

angular.module('ToDoApp', ['ngCookies', 'ngResource', 'ngSanitize', 'ui.router', 'stormpath', 'stormpath.templates'])

5. Add Login and Register to the Angular App

Next, we’ll add some menu items in the index.html file:

<li><a ui-sref="login" if-not-user>Login</a></li>
<li><a ui-sref="register" if-not-user>Register</a></li>
<li><a ui-sref="home" sp-logout if-user>Logout</a></li>

And the routes that go with them:

.state('register', {
  url: '/register',
  templateUrl: '/app/auth/views/register.view.html'
})
.state('login', {
  url: '/login',
  templateUrl: '/app/auth/views/login.view.html'
})
.state('todo', {
  url: '/todo',
  templateUrl: '/app/todo/views/todo.view.html',
  sp: {
     authenticate: true
   }
})

We’ve added a login and register route to the application’s router, and we’ve updated the todo route to only be accessible by logged-in users.

The “Logout” route is completely handled by Stormpath, but we’ll need some handlers for the “Login” and “Register” routes. I’ve created a folder called auth to put these controllers in. First, the controllers:

login.controller.js

(function(){
  function loginController(){}

  angular.module('ToDoApp')
    .controller('LoginController', [loginController]);
}())

register.controller.js

(function(){
  function registerController(){}

  angular.module('ToDoApp')
    .controller('RegisterController', [registerController]);
}())

Also, don’t forget to add the script references to your index.html page:

<script src="app/auth/controllers/login.controller.js"></script>
<script src="app/auth/controllers/register.controller.js"></script>

You’ll notice both controllers are essentially empty. That’s because the real magic happens in the views:

login.view.html

<section ng-controller="LoginController">
  <div sp-login-form></div>
</section>

register.view.html

<section ng-controller="RegisterController">
  <div sp-registration-form post-login-state="todo"></div>
</section>

We used a few directives here that come with the Stormpath Angular SDK. The sp-login-form and sp-registration-form directives create their respective forms, and post-login-state tells the application where to go once the registration process is done and the application logs us in (assuming auto-login is enabled in the Stormpath Management UI).

6. Configure the Angular App

There are some things that need to be configured to make the Angular/.NET/Stormpath combination work seamlessly. First, we need to turn on HTML5 routing (no hashbang URLs) and configure the registration form’s POST behavior (don’t forget to inject $locationProvider and STORMPATH_CONFIG into the application’s config function):

$locationProvider.html5Mode(true);
STORMPATH_CONFIG.FORM_CONTENT_TYPE = 'application/json';
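
These two settings live in the application’s config block. A minimal sketch, assuming ui.router’s $stateProvider is injected alongside them (the state definitions shown earlier are elided):

angular.module('ToDoApp')
  .config(['$stateProvider', '$locationProvider', 'STORMPATH_CONFIG',
    function ($stateProvider, $locationProvider, STORMPATH_CONFIG) {
      // .state('login', ...), .state('register', ...), .state('todo', ...) go here

      $locationProvider.html5Mode(true);
      STORMPATH_CONFIG.FORM_CONTENT_TYPE = 'application/json';
    }]);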

For html5Mode to work in Angular, you must specify the base URL in the index.html file:

<head>
  <meta charset="utf-8">
  <base href="/"> <!-- This tells the application what the base url is for the app -->

This turns on HTML5-type routing for the Angular application, and the STORMPATH_CONFIG line tells the Angular SDK to POST the registration form as application/json. By default, the Angular SDK sends the registration information as application/x-www-form-urlencoded, but the ASP.NET SDK expects JSON to be posted. This line just makes sure they’re on the same page as far as Content-Type headers.

Lastly, we’ll configure how the application behaves after login and after logout:

.run(['$stormpath', '$rootScope', '$state', initializer]);

function initializer($stormpath, $rootScope, $state) {
  // Finally, configure the login state and the default state after login
  $stormpath.uiRouter({
    loginState: 'login',
      defaultPostLoginState: 'todo'
    });

  // Bind the logout event
  $rootScope.$on('$sessionEnd', function () {
    $state.transitionTo('login');
  });
}

This code tells the application that we want users routed to the todo state once someone has successfully logged in, and routed back to the login state when they log out.

7. Adding Authorization to the Server Side

Now that we have the client side authenticating users, we need to make sure the server side only returns ToDos for the currently logged in user. First, we need to make sure that when the user requests their ToDos, they’re authenticated by adding the Authorize attribute to the TodoController:

[Authorize]
[Route("api/[controller]")]
public class TodosController : Controller
{
  // implementation ...
}

Then we just make sure the queries filter by the currently logged in user and attach the current user to our ToDos when they’re being added.

[HttpGet]
public IActionResult Get()
{
    return Ok(_context.Todos.Where(x=>x.User == User.Identity.Name).ToList());
}


[HttpGet("{id}", Name = "GetTodo")]
public IActionResult GetById(int id)
{
    var todo = _context.Todos.SingleOrDefault(t => t.User == User.Identity.Name && t.Id == id);
    if(todo == null)
    {
        return NotFound($"No todo with an Id of {id} was found.");
    }
    return Ok(todo);
}

[HttpPost]
public IActionResult Post([FromBody] Todo todo)
{
    if (string.IsNullOrEmpty(todo.Description))
    {
        return BadRequest("There must be a description in the todo.");
    }
    todo.User = User.Identity.Name;
    _context.Entry(todo).State = todo.Id > 0 ? EntityState.Modified : EntityState.Added;
    _context.SaveChanges();
    return CreatedAtRoute("GetTodo", new { id = todo.Id }, todo);
}

[HttpDelete("{id}")]
public IActionResult Delete(int id)
{
    var todo = _context.Todos.FirstOrDefault(t => t.User == User.Identity.Name && t.Id == id);
    if (todo == null)
    {
        return NotFound($"No todo with an Id of {id} was found for the current user.");
    }


    _context.Todos.Remove(todo);
    _context.SaveChanges();
    return Ok();
}

That’s it! When you fire up the application and try to navigate to the todo page, you should be redirected to the login route, and it should look something like this:


You should now only see ToDos for the currently logged in user, and when you add a ToDo, it should be saved as a ToDo for that user!


Excited to learn more about ASP.NET Core, or user authentication with Stormpath? Check out these resources:

  • Watch: Token Authentication with ASP.NET Core
  • Simple Social Login with ASP.NET Core
  • Tutorial: Build an ASP.NET Core Application with User Authentication
  • And as always, hit me up in the comments below, or on Twitter @leebrandt with questions!


    Optimize Your React Application with Webpack in 15 Minutes


    Looking to optimize the performance of your Spring Boot Application? (Who isn’t?) Sure, you could pull JavaScript and CSS files from a CDN, but for a real performance upgrade you probably want to try bundling your assets with webpack! In a previous tutorial you created a web application using React and Bootstrap that pulled libraries into HTML manually. In this post, you’ll learn how to use webpack to organize your imports more effectively.

    Start a New React Application with Spring Boot

    Start by cloning the previous application and making sure you have Stormpath configured with a ~/.stormpath/apiKey.properties file. It should then boot with Maven.

    mvn spring-boot:run
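
    If you haven’t created the key file yet, it holds just the API key pair downloaded from the Stormpath Admin Console, something like this (placeholder values):

    apiKey.id = <your api key id>
    apiKey.secret = <your api key secret>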

    Open your browser and navigate to http://localhost:8080. This presents you with a login screen and, once you’re logged in, a styled data table.

    React + Spring Boot

    Convert HTML

    Now, strip out the external JavaScript and CSS imports in src/main/resources/templates/index.html. Also remove the Babel specifier in your local script and change app.js to bundle.js. The file ends up looking something like this (any markup you added in the previous tutorial, such as the logout form, stays in place):

    <!DOCTYPE html>
    <html>
    <head>
        <title>React + Spring</title>
    </head>
    <body>
        <div id='root'></div>

        <script src="/public/bundle.js"></script>
    </body>
    </html>

    Move your app.js from src/main/webapp/public to src.

    Webpack Configuration

    Configure webpack by creating a file called webpack.config.js in the root directory. It defines a Node-style module export.

    module.exports = {
      entry: './src/app.js',
      output: {
        path: __dirname + '/src/main/webapp/public', 
        filename: 'bundle.js' 
      }
    };

    entry is the starting point of your app or component. It’s the “main method” of your code, and it’s where webpack starts building its dependency graph.

    output is where the bundled assets end up. You put it into src/main/webapp/public because that is where Spring Boot serves static files from.

    Run Webpack from the Command Line

    With the configuration file complete, all you need to do is run webpack from the command line.

    Note: if you haven’t done so already, install webpack globally with npm install -g webpack.

    Webpack from the Command Line

    You’ll see an error that mentions ‘Unexpected token’. This is because your script is written in ES6 and JSX, and it needs to be transpiled to plain JavaScript. You do this with loaders in your webpack.config.js.

    Use Loaders

    Loaders process assets before they are bundled. You configure them in your webpack.config.js.

    module.exports = {
      entry: './src/app.js',
      output: {
        path: __dirname + '/src/main/webapp/public', 
        filename: 'bundle.js' 
      },
      module: {
        loaders: [
          {
            test: /\.js$/,
            loader: 'babel',
            exclude: /node_modules/,
            query: {
              presets: ['es2015', 'react']
            }
          }
        ]
      }
    };

    You’ve specified that any file ending in .js should be processed with the Babel loader, except for those in the node_modules directory, using the es2015 and React Babel presets.

    You also need to install these modules using npm. However, you’ll need a package.json file first. To create one, run npm init. The values I used below are defaults or examples.

    $ npm init
    This utility will walk you through creating a package.json file.
    It only covers the most common items, and tries to guess sensible defaults.
    
    See `npm help json` for definitive documentation on these fields
    and exactly what they do.
    
    Use `npm install <pkg> --save` afterwards to install a package and
    save it as a dependency in the package.json file.
    
    Press ^C at any time to quit.
    name: (stormpath-spring-boot-react-example) 
    version: (1.0.0) 
    description: Spring Boot + React + Webpack
    entry point: (webpack.config.js) 
    test command: 
    git repository: (https://github.com/stormpath/stormpath-spring-boot-react-example.git) 
    keywords: 
    author: 
    license: (ISC) 
    About to write to /Users/mraible/dev/stormpath-spring-boot-react-example/package.json:
    
    {
      "name": "stormpath-spring-boot-react-example",
      "version": "1.0.0",
      "description": "Spring Boot + React + Webpack",
      "main": "webpack.config.js",
      "dependencies": {
        "babel-loader": "^6.2.7",
        "babel-preset-es2015": "^6.18.0",
        "babel-preset-react": "^6.16.0"
      },
      "devDependencies": {},
      "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1"
      },
      "repository": {
        "type": "git",
        "url": "git+https://github.com/stormpath/stormpath-spring-boot-react-example.git"
      },
      "author": "",
      "license": "ISC",
      "bugs": {
        "url": "https://github.com/stormpath/stormpath-spring-boot-react-example/issues"
      },
      "homepage": "https://github.com/stormpath/stormpath-spring-boot-react-example#readme"
    }
    
    Is this ok? (yes) yes

    After creating package.json, you should be able to install the necessary dependencies.

    npm install --save-dev babel-loader babel-preset-es2015 babel-preset-react

    You might see some warnings about peer dependencies not being met. If you do, run the following command:

    npm install --save-dev babel-core webpack

    Now when you run webpack from the command line, you should see a success message!

    Use Loaders (React + Webpack)

    Add Dependencies

    If you refresh the page, you’ll see that only the login button appears, unstyled. Opening the browser console shows that React was not found in bundle.js.

    React dependencies

    You need to import the libraries in app.js.

    import React from 'react';
    import ReactDOM from 'react-dom';
    import $ from 'jquery';
    import toastr from 'toastr';

    Also, pull the libraries in locally using npm.

    npm install --save react react-dom jquery toastr

    After running webpack again and refreshing your browser, you should see your component but without styling.

    Unstyled component

    Style Your CSS

    To get styling to work, you once again need to install the required libraries using npm.

    npm install --save bootstrap

    Now import the style files directly into your app.js.

    import 'bootstrap/dist/css/bootstrap.css';
    import 'toastr/build/toastr.css';

    Finally, add loaders for both the CSS files and for any assets the CSS might reference.

    {
      test: /\.css$/,
      loader: 'style!css'
    },
    { 
      test: /\.(woff2?|ttf|eot|svg|png|jpe?g|gif)$/,
      loader: 'file'
    }

    style!css means the file is run through css-loader first, and the result is then handed to style-loader (chained loaders apply from right to left). Install the loaders via npm.

    npm install --save-dev style-loader css-loader file-loader

    Now when you run webpack, you should see successful output and the various assets that were converted.

    Assets converted (React + Webpack)

    Refresh your page, and you should see the same, fully styled component as before.

    You’ve just converted your React/Bootstrap app over to webpack!

    Command Line Options for Webpack

    There are many things you can do with webpack. For example, to minify your bundles for production just add -p when you build.

    webpack -p

    You can also use -w to watch for changes and build automatically.
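
    For example, to rebuild the bundle automatically whenever a source file changes:

    webpack -w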

    Splitting Out CSS

    You might also want to split your CSS out into a separate bundle. This can be achieved with the ExtractTextPlugin. First, install the plugin via npm.

    npm install --save-dev extract-text-webpack-plugin

    Now change your CSS loader to use the plugin. Start by importing the plugin at the top of your webpack.config.js.

    var ExtractTextPlugin = require('extract-text-webpack-plugin');

    Then replace the CSS loader.

    { 
      test: /\.css$/, 
      loader: ExtractTextPlugin.extract("style-loader", "css-loader")
    }

    Finally, add a plugins section below module.

    module: {
    ...
    },
    plugins: [
      new ExtractTextPlugin("styles.css")
    ]

    Now you can update your HTML to pull in the new bundle.
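
    A sketch of the stylesheet link, assuming the same /public output path used for bundle.js:

    <link rel="stylesheet" href="/public/styles.css">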

    Integrate with Gulp

    If you need to automate external tasks, you can integrate your webpack setup with Gulp. Start by installing Gulp and the webpack stream utility.

    npm install --save-dev gulp webpack-stream

    Then create the following gulpfile.js.

    var gulp = require('gulp');
    var webpack = require('webpack-stream');
    gulp.task('default', function() {
        return gulp.src('src/entry.js')
            .pipe(webpack(require('./webpack.config.js')))
            .pipe(gulp.dest('src/main/webapp/public'));
    });

    Now running gulp from the command line will run your webpack configuration as before.

    Gulp and your webpack configuration

    Components Everywhere!

    The prevailing ethos in web design today is component-based: applications are split into reusable pieces. React is a leader in this regard.

    Writing traditional HTML to support components doesn’t make sense. Manually pulling in assets, whether libraries such as jQuery or styles like Bootstrap, quickly becomes a mess. Webpack solves this for you!

    I hope this tutorial has shown you how easy it is to configure webpack in your React application. To view the finished code, check out the GitHub repo.

    Or, if you’re interested in learning more about React or Spring Boot, check out one of these great resources:


    Store & Protect Sensitive Data in ASP.NET Core


    Storing sensitive configuration data (i.e. API keys, database usernames, and passwords) appropriately is anything but a trivial concern for application developers. There are a variety of recommended approaches in ASP.NET Core, each with its own advantages and disadvantages. Making the right decision based on your application’s security requirements is critical, but not always simple. In this article, we will walk through different options for protecting your application secrets in both development and production environments.

    Security in Development

    It’s a poor (read: non-secure) practice to store sensitive data as constants in code or in plain-text configuration files. You should always separate the configuration from the code, because configuration varies between environments while the code stays the same. This is also an obstacle if you want to make your code open source, because all your sensitive data would become public.

    Application secrets should also not be part of your source control repository, for the same reason. If your code goes open source, secrets become public too. Let’s see how we can store this information in a way that avoids these problems.

    Environment Configurations

    ASP.NET Core allows you to manage different environment configurations using the appsettings.json file.

    The idea is simple: all the base configuration goes in the appsettings.json file. Then, you can add environment-specific configuration by creating additional configuration files whose names contain the environment they belong to, e.g. appsettings.development.json. Each of these files can be stored directly on the corresponding environment’s server.

    The current environment is set via the ASPNETCORE_ENVIRONMENT environment variable. The values Development, Staging, and Production are used by convention, but you can add any environment name you like, for example, jon_snow_staging.
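
    For example, to run with the Staging settings:

    # macOS/Linux
    export ASPNETCORE_ENVIRONMENT=Staging

    # Windows (PowerShell)
    $env:ASPNETCORE_ENVIRONMENT = "Staging"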

    During application startup, the environment name will be determined from this environment variable and will load the environment-specific settings file:

    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);
    
        builder.AddEnvironmentVariables();
        Configuration = builder.Build();
    }

    The order in which configuration files are listed is important because if the same setting exists on more than one file, the last one is the one that prevails.

    Remember that these configuration files should not be part of your source control repository.

    This approach has some drawbacks: it doesn’t scale well with the number of environments the application has to handle, as every environment needs its own configuration file. It’s also a little too easy to accidentally commit these files to the source control repository, as they aren’t ignored by default; if you want source control to ignore them, you have to add them to the ignore list manually.

    Environment Variables

    A better approach to securing your config data is to store secrets in environment variables instead of local configuration files. During application startup you can call the AddEnvironmentVariables method to read all the values defined in the environment variables.

    As I mentioned before, the order of configuration sources in Startup is important. The suggested approach is to call this method last so that environment variables can override anything set in the local configuration files.

    Environment variables are completely decoupled from code, so you can change their values without changing the deployed code. Also, unlike appsettings.json files, you can’t accidentally check them into source control, and they are independent of the language and the underlying OS.
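
    For example, a nested key (the name here is purely illustrative) can be supplied through an environment variable by using the double-underscore separator, which the environment variables provider maps to the usual colon syntax:

    export Stormpath__ApiKey__Secret="<your secret>"

    Once AddEnvironmentVariables has run, Configuration["Stormpath:ApiKey:Secret"] returns that value.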

    But beware! Environment variables are usually not encrypted. So anyone with access to the system can read them. Also, if you need to change these values, you’ll need to restart your application in order to read the new values.

    Secrets Manager

    .NET Core provides a Secret Manager tool, which is an elegant option if used ONLY in development, never in production. This tool allows you to store your sensitive data locally on your machine, outside the project tree.

    The keys are stored in a JSON configuration file in the user profile directory, and the way to access them is similar to the previous approaches:

    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            ...

        if (env.IsDevelopment())
        {
            builder.AddUserSecrets();
        }

        builder.AddEnvironmentVariables();
        Configuration = builder.Build();
    }
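
    For example, assuming the project has been set up for user secrets (a userSecretsId in project.json at the time this article was written), a secret can be added from the project directory with the CLI (the key name is illustrative):

    dotnet user-secrets set Stormpath:ApiKey:Secret "<secret_value>"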

    This tool is not super secure, and the keys are not encrypted, but it provides an easy way to avoid storing secrets in your project config files and having to remember to add them to the source control ignore list.

    Security in Production

    So far, we have reviewed some options for development. The first and second approaches (appsettings and environment variables) can be viably implemented in production too.
    These approaches could be a solid option if only medium-level security is required for your application, but if you need more security for your data, read on!

    Secure Secrets in Azure

    If you’re hosting your application on Azure, you can store your secrets in the application settings of your Azure Web App. This configuration overrides the settings you have in the configuration files, so if a setting is duplicated, the Azure value takes precedence.

    Azure also provides a more secure option: Azure Key Vault. Key Vault is a cloud-hosted service for managing secrets, which will be accessible by the applications you authorize through an encrypted channel.

    Key Vault allows you to encrypt keys and secrets by using keys that are protected by hardware security modules (HSMs). An HSM is a device or module designed for the protection of the crypto key lifecycle.

    Fortunately, anyone with an Azure subscription can create and use key vaults. This approach requires a little bit of setup, which you can find instructions for on the Microsoft blog.

    Custom Security Providers

    If you decide to store your secrets in the appsettings.json file, you can easily encrypt and decrypt them via a custom configuration provider. Your custom provider should inherit from the ConfigurationProvider class. This class has a Load method that you override with your custom configuration-loading logic.

    For example, suppose you have an encryption utility class. You can create a CustomConfigProvider class that decrypts all your configuration settings from a file and loads them into Data:

    public class CustomConfigProvider : ConfigurationProvider
    {
        public CustomConfigProvider() { }

        public override void Load()
        {
            Data = MyEncryptUtils.DecryptConfiguration();
        }
    }

    To hook it into the ASP.NET Core configuration pipeline, you need to define a custom IConfigurationSource. In its Build method you return the custom configuration provider:

    public class CustomConfigurationSource : IConfigurationSource
    {
        public CustomConfigurationSource() { }

        public IConfigurationProvider Build(IConfigurationBuilder builder)
        {
            return new CustomConfigProvider();
        }
    }

    Then, create an extension method to add the custom configuration source to the configuration builder:

    public static class CustomConfigurationExtensions
    {
        public static IConfigurationBuilder AddCustomConfiguration(this IConfigurationBuilder builder)
        {
            return builder.Add(new CustomConfigurationSource());
        }
    }

    And call it during the application startup:

    var builder = new ConfigurationBuilder()
                .AddJsonFile("appsettings.json")
                .AddCustomConfiguration()
                .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);

    Guidelines and Next Steps

    As you can see, there are several options for storing and protecting your data, each with different levels of security and ease of use. No matter which you ultimately select, follow these basic guidelines:

  • Never commit secrets or sensitive data to a code repository, even a private repository
  • Do not store secrets or sensitive data in source code
  • Use an encryption mechanism to keep your secrets safe
  • I like the ASP.NET Core Secret Manager tool because it’s easy to use and helps developers to keep their secrets out of the source control, but this is only applicable in development. For production, I utilize environment variables for most of my medium-level security applications as they are suitable for any environment. Environment variables are also easy to work into most Continuous Integration workflows, as most CI servers have a way to implement them.

    What about you? Let us know what approach you’re using for your applications in the comments!

    Or, if you’re interested in learning more about secure user management in ASP.NET Core, check out any of the following resources:

  • User Authentication with Angular and ASP.NET Core
  • OpenID Connect for User Auth in ASP.NET Core
  • Our ASP.NET Core Token Authentication Guide
  • 5 Myths of Password Security
  • 3 Quick Ways to Increase Customer Data Security

    CSRF Protection with JWTs in Spring Security


    If you’ve never heard of JWTs (JSON Web Tokens), well, you don’t work in tech, or you’ve purposely unplugged your computer from the Internet. JWTs are frequently used in OAuth2 as access and refresh tokens as well as a variety of other applications.

    JWTs can be used wherever you need a stand-in to represent a “user” of some kind (in quotes, because the user could be another microservice). And, they’re used where you want to carry additional information beyond the value of the token itself and have that information cryptographically verifiable as security against corruption or tampering.

    For more information on the background and structure of JWTs, here’s the IETF specification.

    The code that backs this post can be found on GitHub.

    Spring Security & CSRF Protection

    CSRF (Cross Site Request Forgery) is a technique in which an attacker attempts to trick you into performing an action using an existing session on a different website.

    Spring Security, when combined with Thymeleaf templates, automatically inserts a token into every web form as a hidden field. This token must be present on form submission, or a Java exception is thrown. This mitigates the risk of CSRF, as an external site (an attacker) would not be able to reproduce this token.

    For this sample project, the following dependencies are all that’s required to get Spring Boot, Spring Security, and Thymeleaf:

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-security</artifactId>
        </dependency>
    </dependencies>

    Here’s a simple Thymeleaf form:

    <!DOCTYPE html>
    <html lang="en" xmlns:th="http://www.thymeleaf.org">
        <body>
            <form method="post" th:action="@{/jwt-csrf-form}">
                <input type="submit" class="btn btn-primary" value="Click Me!"/>
            </form>
        </body>
    </html>

    Notice the xmlns:th attribute in the html tag as well as the th:action attribute of the form tag. It’s these attributes that trigger Spring Security to inject the CSRF protection token into the form. Here’s what that looks like:

    <input type="hidden" name="_csrf" value="72501b07-8205-491d-ba95-ebab5cf450de" />

    This is what’s called a “dumb” token. Spring Security keeps a record of this token, usually in the user’s session. When the form is submitted, it compares the value of the token to what Spring Security has on record. If the token is not present or is not the right value, an Exception is thrown.

    We can improve on this using a JWT in the following ways:

    • Ensure that a given token can only be used once by using a nonce cache
    • Set a short expiration time for added security
    • Verify that the token hasn’t been tampered with using cryptographic signatures

    Switching to JWTs for CSRF Protection

    The JJWT (Java JWT) library is the premier open-source Java library for working with JSON Web Tokens. Its clean design, including a fluent interface, has led to over 1,000 stars on GitHub.

    We can easily add the JJWT library to our project by dropping in the following dependency:

    <dependency>
        <groupId>io.jsonwebtoken</groupId>
        <artifactId>jjwt</artifactId>
        <version>${jjwt.version}</version>
    </dependency>

    Spring Security makes it easy to override the default CSRF behavior. We add three components to make this happen:

    • CSRF Token Repository
    • CSRF Token Validator
    • Spring Security Configuration

    CSRF Token Repository

    Implementing the CsrfTokenRepository interface requires three methods: generateToken, saveToken, and loadToken.

    Here’s our generateToken method:

    @Override
    public CsrfToken generateToken(HttpServletRequest request) {
        String id = UUID.randomUUID().toString().replace("-", "");
    
    
        Date now = new Date();
        Date exp = new Date(now.getTime() + (1000*30)); // 30 seconds
    
    
        String token = Jwts.builder()
            .setId(id)
            .setIssuedAt(now)
            .setNotBefore(now)
            .setExpiration(exp)
            .signWith(SignatureAlgorithm.HS256, secret)
            .compact();
    
    
        return new DefaultCsrfToken("X-CSRF-TOKEN", "_csrf", token);
    }

    Here we see the JJWT fluent interface in action. We chain all the claims settings together and call the compact terminator method to give us the final JWT string. Most importantly, this JWT will expire after 30 seconds.

    The saveToken and loadToken methods do just what they say. In this example, they are saving the token to and loading the token from the user’s session.
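
    A minimal sketch of what those two methods might look like when backed by the HTTP session (the actual implementation in the repo may differ slightly; the session attribute name is assumed):

    @Override
    public void saveToken(CsrfToken token, HttpServletRequest request, HttpServletResponse response) {
        if (token == null) {
            // a null token means the repository should clear any stored token
            request.getSession().removeAttribute("JWT_CSRF_TOKEN");
        } else {
            request.getSession().setAttribute("JWT_CSRF_TOKEN", token);
        }
    }

    @Override
    public CsrfToken loadToken(HttpServletRequest request) {
        HttpSession session = request.getSession(false);
        return (session != null) ? (CsrfToken) session.getAttribute("JWT_CSRF_TOKEN") : null;
    }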

    CSRF Token Validator

    Spring Security will already do the “dumb” part of the CSRF check and verify that the string it has stored matches the string that’s passed in exactly. In addition, we want to leverage the information encoded in the JWT. This is implemented as a filter.

    Here’s the core of the JwtCsrfValidatorFilter:

    // CsrfFilter already made sure the token matched. Here, we'll make sure it's not expired
    try {
        Jwts.parser()
            .setSigningKeyResolver(secretService.getSigningKeyResolver())
            .parseClaimsJws(token.getToken());
    } catch (JwtException e) {
        // most likely an ExpiredJwtException, but this will handle any
        request.setAttribute("exception", e);
        response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
        RequestDispatcher dispatcher = request.getRequestDispatcher("expired-jwt");
        dispatcher.forward(request, response);
    }

    If the JWT is parseable, processing will continue. As you can see in the catch block, if parsing the JWT fails for any reason, we forward the request to an error page.

    Spring Security Configuration

    The Spring Security configuration ties it all together by registering our CSRF Token Repository with Spring Security. Here’s the configure method:

    protected void configure(HttpSecurity http) throws Exception {
        http
            .addFilterAfter(new JwtCsrfValidatorFilter(), CsrfFilter.class)
            .csrf()
                .csrfTokenRepository(jwtCsrfTokenRepository)
                .ignoringAntMatchers(ignoreCsrfAntMatchers)
            .and()
            .authorizeRequests()
                .antMatchers("/**")
                .permitAll();
    }

    The addFilterAfter call adds our validator filter after the default Spring Security CsrfFilter.

    The csrfTokenRepository call tells Spring Security to use our JWT CSRF token repository instead of the default one.

    JWT CSRF Protection in Action

    To run the sample app, clone the GitHub repo and execute:

    cd JavaRoadStorm2016/roadstorm-jwt-csrf-tutorial
    mvn clean install
    mvn spring-boot:run

    In your browser, you can go to: http://localhost:8080. You will see a humble button:

    Click me!

    If you view source, you can see how things are setup:

    Source setup for CSRF

    NOTE: When you view source, it will invalidate the token in the page since a new one is fetched to show source. Just refresh the original page.

    If you wait for more than 30 seconds and click the button, you will see an error:

    JWT CSRF Token Expired

    Next Up: Securing Microservices with JWTs

    In this post, we’ve seen the benefit of using JWTs for CSRF protection with Spring Security.

    JWTs are very powerful for general token use due to their inherent smarts – the encoded claims within them as well as their ability to be cryptographically verified.

    In the next post, we’ll dive a little deeper in an example for establishing trust and communicating between microservices. Have questions or feedback? Leave a comment or hit me up on Twitter @afitnerd!

    The post CSRF Protection with JWTs in Spring Security appeared first on Stormpath User Identity API.

    Securing JSPs with Spring Security and Stormpath

    $
    0
    0

    Even though JSPs have fallen out of fashion lately, they are still a core part of many enterprise infrastructures. In this tutorial, we’ll show you how to secure them using the excellent Spring Security suite and Stormpath’s Spring Boot integration for user management.

    JSPs and Spring Boot

    We’ll start by serving up a simple JSP homepage using Spring Boot. The most basic JSP is plain HTML with a variable or two thrown in.

    <!doctype html>
    <html>
    <body>
      <p>Hello, <%="World"%>!</p>
    </body>
    </html>

    If rendered correctly, we should see ‘Hello, World!’ when the page loads. To do this we first put the file into the src/main/webapp/WEB-INF/jsp directory.

    JSP Directory

    Then inside of application.properties we need to specify where our JSP files reside.

    spring.mvc.view.prefix: /WEB-INF/jsp/
    spring.mvc.view.suffix: .jsp

    Now create a basic Application.java.

    @SpringBootApplication
    public class Application  {
        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }
    }

    And then create an even more basic Request.java.

    @Controller
    public class Request {
    
        @GetMapping("/")
        String home() {
            return "home";
        }
    }

    Mapping a forward slash to home really means the file home.jsp with the path defined in our application.properties.
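    To make the resolution a bit more concrete, a hypothetical second mapping could pass data to a JSP through the model; the /hello path and the name attribute below are made up for this example.

    @GetMapping("/hello")
    String hello(Model model) {
        // "hello" resolves to /WEB-INF/jsp/hello.jsp via the prefix and suffix configured above
        model.addAttribute("name", "World");
        return "hello";
    }

    Inside hello.jsp the attribute would then be available through the expression language as ${name}.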

    The last thing we need is a pom.xml. Here we include the Spring Boot Starter Parent (so we don’t have to specify dependency versions), the Spring Boot Maven Plugin (so we can run our app directly), the Tomcat Jasper (to serve our JSPs) and the Spring Boot Starter Web (for route controlling).

    <project>
        <modelVersion>4.0.0</modelVersion>
    
        <groupId>com.stormpath.sample</groupId>
        <artifactId>stormpath-security</artifactId>
        <version>0.1.0</version>
    
        <parent>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>1.4.0.RELEASE</version>
        </parent>
    
        <dependencies>
            <dependency>
                <groupId>org.apache.tomcat.embed</groupId>
                <artifactId>tomcat-embed-jasper</artifactId>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-web</artifactId>
            </dependency>
        </dependencies>
    
        <build>
            <plugins>
                <plugin>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-maven-plugin</artifactId>
                </plugin>
            </plugins>
        </build>
    </project>

    Now when we run with mvn spring-boot:run and visit localhost:8080 we should see our hello world message.

    JSP Hello World

    Add Security

    Spring Security is divided into two parts: authentication and authorization. The first verifies your user’s identity, while the second handles what the user can and can’t do.

    To authenticate we are going to use Stormpath’s Java integration which is as simple as adding http.apply(stormpath()) to a basic security adapter.

    @Configuration
    public class Security extends WebSecurityConfigurerAdapter {
        @Override
        protected void configure(HttpSecurity http) throws Exception {
            http.apply(stormpath());
        }
    }

    We also need to add our Stormpath application details to our application.properties.

    stormpath.application.href = <your app href>
    stormpath.client.apiKey.id = <your api key id>
    stormpath.client.apiKey.secret = <your api key secret>

    Note: For security reasons, you should not store your Stormpath keys inside of your project’s files. Instead, use environment variables.

    In addition we need to add the Stormpath starter to our pom.xml.

    <dependency>
        <groupId>com.stormpath.spring</groupId>
        <artifactId>stormpath-default-spring-boot-starter</artifactId>
        <version>1.1.3</version>
    </dependency>

    However, this includes Thymeleaf templating which clashes with Jasper (which we use to serve our JSPs). We need to exclude it. Instead of the above code use the following.

    <dependency>
        <groupId>com.stormpath.spring</groupId>
        <artifactId>stormpath-default-spring-boot-starter</artifactId>
        <version>1.1.3</version>
         <exclusions>
            <exclusion>
                <groupId>com.stormpath.spring</groupId>
                <artifactId>stormpath-thymeleaf-spring-boot-starter</artifactId>
            </exclusion>
        </exclusions> 
    </dependency>

    If you restart your application and refresh the homepage, you should see Stormpath prompting you to log in. This is because we haven’t specified access controls per page; by default, everything is locked down.

    Login

    Typing in the details for a user account in your Stormpath application should take you to the ‘Hello, World!’ message as before.
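    If you would rather keep the homepage public and only lock down other paths, the configure method can be extended along these lines. This is just a sketch; the matched paths are placeholders for your own.

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http
            .apply(stormpath()).and()
            .authorizeRequests()
                .antMatchers("/").permitAll()        // anyone can see the homepage
                .anyRequest().fullyAuthenticated();  // everything else requires login
    }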

    Authorization

    Finally, to illustrate authorization we will use Spring Security’s JSP tag libraries, which allow us to control behaviour based on Access Expressions (we’ll deal with those in a second).

    First we add the tag libraries to our pom.xml.

    <dependency>
        <groupId>org.springframework.security</groupId>
        <artifactId>spring-security-taglibs</artifactId>
    </dependency>

    Next we import the library to our JSP by adding the following line to the top.

    <%@ taglib prefix="sec" uri="http://www.springframework.org/security/tags" %>

    With that we can use the special sec tags to control how the JSP is rendered. For example take the following section of code.

    <sec:authorize access="hasAuthority('supervisor')">
    
    This content will only be visible to users who have
    the "supervisor" authority in their list of <tt>GrantedAuthority</tt>s.
    
    </sec:authorize>

    Only users with authority supervisor will be able to see the code in between the tags.

    Role, Authority, and Access Expressions

    Spring Security allows you to define policies on methods and attributes using the Spring Expression Language. For example you could use the @PreAuthorize annotation to specify whether a user can access a function.

    @PreAuthorize("hasRole('ROLE_USER')")
    public void create(Contact contact);

    Part of the expression language is the concept of Role and Authority, which Spring Security uses to associate rights with users. To illustrate this, take a look at the following XML config excerpt, in the style used before the Java configurations that are popular today.

    <http auto-config="true">
        <intercept-url pattern="/admin**" access="ROLE_ADMIN" />
        <intercept-url pattern="/dba**" access="ROLE_ADMIN,ROLE_DBA" />
    </http>

    This is how we would declare who is allowed to see the /admin and /dba URLs. It is not per user but per role. We then have to define who has which role when creating users.

    <authentication-manager>
        <authentication-provider>
            <user-service>
                <user name="karl" password="123456" authorities="ROLE_USER" />
                <user name="admin" password="123456" authorities="ROLE_ADMIN" />
                <user name="dba" password="123456" authorities="ROLE_DBA" />
            </user-service>
        </authentication-provider>
    </authentication-manager>

    Note: In Stormpath we use hasAuthority throughout. This is because hasRole is the same as hasAuthority but just with the string ‘ROLE_’ prepended. See details of our Spring Boot migration for more.
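    In other words, the following two annotations protect a method in exactly the same way (the method names here are just placeholders):

    // hasRole('ADMIN') silently prepends "ROLE_", so both checks look for the ROLE_ADMIN authority
    @PreAuthorize("hasRole('ADMIN')")
    public void deleteAllReports();

    @PreAuthorize("hasAuthority('ROLE_ADMIN')")
    public void purgeAuditLog();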

    Stormpath Groups

    Groups in Stormpath

    Groups in Stormpath are analogous to Spring Security authorities. To specify a group, we use the href shown when you open the group in the Stormpath Admin Console.

    <sec:authorize access="hasAuthority('https://api.stormpath.com/v1/groups/<your href here>')">

    If we add the referenced group to the user we logged in with previously and reload, we should see the message inside the sec tags.

    New secure Hello World

    In this way, we can control who is allowed to access which sections of code such as buttons and forms.

    JSP + Stormpath FTW

    Hopefully, you can see how easy it is to secure your JSPs using Stormpath and Spring Security. With just a few files you can create a flexible system that allows centrally managing users and controlling what they can see and do based on the groups they belong to.

    Interested in learning more about user management with Stormpath? Well you’re in luck, because we’ve got everything you need to get started!

    The post Securing JSPs with Spring Security and Stormpath appeared first on Stormpath User Identity API.

    Build an Angular 2 Application with User Authentication in 10 Minutes

    $
    0
    0

    Today I’m happy to announce the first (beta) release of Stormpath’s Angular 2 support! The npm module is called angular-stormpath and you can easily install it using npm install --save angular-stormpath. If you’d like to try Angular 2 with Stormpath without writing any code, you can check out the project from GitHub and run its demo. You will need to have your Stormpath API key set up for this to work.

    git clone https://github.com/stormpath/stormpath-sdk-angular.git
    cd stormpath-sdk-angular
    npm install
    npm start

    If you’d like to learn how to integrate our Angular 2 components into your own application, continue reading!

    What Is Stormpath?

    Stormpath is an API service that allows developers to create, edit, and securely store user accounts and user account data, and connect them with one or multiple applications. We make user account management a lot easier, more secure, and infinitely scalable. To get started register for a free account.

    Create an Angular 2 Application with Express

    To see how you might use this in a simple Angular 2 application, create a new application with Angular CLI. First, you’ll need to install Angular CLI.

    npm install -g angular-cli

    After this command completes, you can create a new application.

    ng new angular2-express-stormpath-example

    The reason I included “express” in the project name is that Stormpath currently requires one of our backend integrations to communicate with Stormpath’s API. For this example, you’ll use express-stormpath.

    ng new

    From the command line, cd into angular2-express-stormpath-example and run ng e2e. All tests should pass and you should see results like the following.

    ng e2e

    Integrate Stormpath’s Angular 2 Support

    Add angular-stormpath to the project:

    npm install angular-stormpath --save

    In src/app/app.component.html, add HTML that shows a welcome message to the user when they’re logged in. When they’re not logged in, the <sp-authport></sp-authport> component will render forms to register, login, and retrieve forgotten passwords.

    {{title}}

    Welcome, ({{ ( user$ | async ).fullName }}).


    What would you like to do?

    In src/app/app.component.ts, add the following variables, constructor, and methods to the body of AppComponent:

    import { Stormpath, Account } from 'angular-stormpath';
    import { Observable } from 'rxjs';
    ...
    export class AppComponent {
      title = 'app works!';
      private user$: Observable<Account | boolean>;
      private loggedIn$: Observable<boolean>;
      private login: boolean;
      private register: boolean;
    
      constructor(public stormpath: Stormpath) {
      }
    
      ngOnInit() {
        this.login = true;
        this.register = false;
        this.user$ = this.stormpath.user$;
        this.loggedIn$ = this.user$.map(user => !!user);
      }
    
      showLogin() {
        this.login = !(this.register = false);
      }
    
      showRegister() {
        this.register = !(this.login = false);
      }
    
      logout() {
        this.stormpath.logout();
      }
    }

    If you run npm start and view http://localhost:4200 in your browser, you’ll see “Loading…”, but nothing renders. A quick check of the console will show you errors about sp-authport not being a known element.

    sp-authport error

    This happens because Stormpath’s Angular 2 components haven’t been imported into the application’s module. Open src/app/app.module.ts and import StormpathModule.

    import { NgModule } from '@angular/core';
    import { BrowserModule } from '@angular/platform-browser';
    import { FormsModule } from '@angular/forms';
    import { HttpModule } from '@angular/http';
    
    import { AppComponent } from './app.component';
    import { StormpathModule } from 'angular-stormpath';
    
    @NgModule({
      declarations: [
        AppComponent
      ],
      imports: [
        BrowserModule,
        FormsModule,
        HttpModule,
        StormpathModule
      ],
      providers: [],
      bootstrap: [AppComponent]
    })
    export class AppModule { }

    Now the app should launch correctly, but you’ll see a 404 in your console for the /me endpoint.

    /me 404

    Install Stormpath’s Express Support

    To fix this, install express-stormpath:

    npm install express-stormpath --save-dev

    Create a server directory and a server.js file in it. In this file, create an Express application and protect it with Stormpath.

    'use strict';
    
    var express = require('express');
    var path = require('path');
    var stormpath = require('express-stormpath');
    
    /**
     * Create the Express application.
     */
    var app = express();
    
    /**
     * The 'trust proxy' setting is required if you will be deploying your
     * application to Heroku, or any other environment where you will be behind an
     * HTTPS proxy.
     */
    app.set('trust proxy', true);
    
    /*
     We need to setup a static file server that can serve the assets for the
     angular application.  We don't need to authenticate those requests, so we
     setup this server before we initialize Stormpath.
     */
    
    app.use('/', express.static(path.join(__dirname, '..'), {redirect: false}));
    
    app.use(function (req, res, next) {
      console.log(new Date, req.method, req.url);
      next();
    });
    
    /**
     * Now we initialize Stormpath, any middleware that is registered after this
     * point will be protected by Stormpath.
     */
    console.log('Initializing Stormpath');
    
    app.use(stormpath.init(app, {
      web: {
        // produces: ['text/html'],
        spa: {
          enabled: true,
          view: path.join(__dirname, '..', 'index.html')
        },
        me: {
          // enabled: false,
          expand: {
            customData: true,
            groups: true
          }
        }
      }
    }));
    
    /**
     * Now that our static file server and Stormpath are configured, we let Express
     * know that any other route that hasn't been defined should load the Angular
     * application.  It then becomes the responsibility of the Angular application
     * to define all view routes, and redirect to the home page if the URL is not
     * defined.
     */
    app.route('/*')
      .get(function (req, res) {
        res.sendFile(path.join(__dirname, '..', 'index.html'));
      });
    
    /**
     * Start the web server.
     */
    app.on('stormpath.ready', function () {
      console.log('Stormpath Ready');
    });
    
    var port = process.env.PORT || 3000;
    app.listen(port, function () {
      console.log('Application running at http://localhost:' + port);
    });

    This will now serve the Stormpath endpoints (e.g. /login, /logout, /me) on port 3000 when it’s started. However, since Angular CLI runs on port 4200, you have to proxy these requests.

    Proxy Requests to Express

    Create a proxy.conf.json file in the root directory of the project to contain proxy definitions.

    {
      "/forgot": {
        "target": "http://localhost:3000",
        "secure": false
      },
      "/login": {
        "target": "http://localhost:3000",
        "secure": false
      },
      "/logout": {
        "target": "http://localhost:3000",
        "secure": false
      },
      "/me": {
        "target": "http://localhost:3000",
        "secure": false
      },
      "/register": {
        "target": "http://localhost:3000",
        "secure": false
      }
    }

    Next, change package.json to modify npm start to start express and run ng serve with proxy support.

    "scripts": {
        "start": "concurrently --raw \"ng serve --proxy-config proxy.conf.json\" \"node server/server.js stormpath-api\"",
        "lint": "tslint \"src/**/*.ts\"",
        "test": "ng test",
        "pree2e": "webdriver-manager update",
        "e2e": "protractor"
      },

    You’ll need to install concurrently to make this command work.

    npm install concurrently --save-dev

    Run npm start and you should be able to navigate between Login, Register, and Forgot Password in your browser. The forms won’t be pretty though. You can make them look good by adding Bootstrap to src/index.html.

    login

    register forgot password

    If you haven’t registered for Stormpath, you’ll see an error like the following when you run npm start. The Quickstart guide for Stormpath’s Node.js integration explains how to register and create an API key pair.

    No API Key Error

    Fix Tests

    If you try to run npm test or ng test, tests will fail with the same error you saw before:

    'sp-authport' is not a known element:
    1. If 'sp-authport' is an Angular component, then verify that it is part of this module.

    The first step to fixing this is to import StormpathModule into src/app/app.component.spec.ts.

    beforeEach(() => {
      TestBed.configureTestingModule({
        declarations: [
          AppComponent
        ],
        imports: [StormpathModule]
      });
    });

    This will get you a bit further, but there will be an error about the /me endpoint not being found.

    Chrome 54.0.2840 (Mac OS X 10.12.1) ERROR
      Uncaught Error: /me endpoint not found, please check server configuration.

    To work around this, you can override Angular’s Http dependency and mock out its backend.

    import { StormpathModule, Stormpath } from 'angular-stormpath';
    import { BaseRequestOptions, Http, ConnectionBackend } from '@angular/http';
    import { MockBackend } from '@angular/http/testing';
    ...
    beforeEach(() => {
      TestBed.configureTestingModule({
        declarations: [AppComponent],
        imports: [StormpathModule],
        providers: [
          {
            provide: Http, useFactory: (backend: ConnectionBackend, defaultOptions: BaseRequestOptions) => {
            return new Http(backend, defaultOptions);
          },
            deps: [MockBackend, BaseRequestOptions]
          },
          {provide: Stormpath, useClass: Stormpath},
          {provide: MockBackend, useClass: MockBackend},
          {provide: BaseRequestOptions, useClass: BaseRequestOptions}
        ]
      });
    });

    After making these changes, you should see the sweet smell of success.

    Chrome 54.0.2840 (Mac OS X 10.12.1): Executed 3 of 3 SUCCESS (0.536 secs / 0.532 secs)

    Protractor tests should still work as well. You can prove this by running npm start in one terminal and npm run e2e in another.

    Kudos

    Thanks to Stormpath’s Robert Damphousse for providing a preview of Angular 2 support and writing most of the code in this release. I’d also like to thank Matt Lewis for his generator-angular2-module. Matt’s library made it easy to create this module and he was a great help in getting tests to work.

    Angular 2 + Express Source Code

    A completed version of the application created in this blog post is available on GitHub.

    I hope you’ve enjoyed this quick tour of our Angular 2 support. If you have any questions about features or our roadmap going forward, please hit me up on Twitter, leave a comment below, or open an issue on GitHub.

    The post Build an Angular 2 Application with User Authentication in 10 Minutes appeared first on Stormpath User Identity API.

    The Architecture of Stormpath’s Java SDK

    $
    0
    0

    Stormpath provides several language-specific SDKs to allow simple interaction with its REST API, and the Java SDK is one of our most popular. In this article, we’ll dive under the hood and take a closer look at the architecture of the Java SDK. First of all, you might wonder: why build an SDK at all?

    Well, that’s simple. We absolutely believe that our ecosystem of SDKs and integrations makes developers’ lives easier, and that’s our mission. We started Stormpath because auth is tough even if you are a security expert, and can be impossible to get right if you’re not. Fast forward three years and now Stormpath offers 13 different SDKs.

    Stormpath SDKs

    The SDKs have been extremely popular among developers. The reason is simple: they can add just a few lines of code and be integrated with Stormpath in minutes!

    From REST to Java

    The Java SDK does its best to keep developers in mind. As developers of the SDK, this isn’t too difficult since we’re developers too! The Core Java SDK is largely a wrapper around the REST API, which can also be used if you want to experiment from the command line, or write your own SDK.

    TIP: Les Hazlewood wrote a blog post about how we migrated our backend to Spring Boot in 3 weeks.

    To create a user account with the REST API, you can use curl:

    curl --request POST \
      --user $SP_API_KEY_ID:$SP_API_KEY_SECRET \
      --header 'content-type: application/json' \
      --url "https://api.stormpath.com/v1/applications/1gk4Dxzi6o4PbdlEXampLE/accounts"
      --data '{
      "givenName": "Joe",
      "surname": "Stormtrooper",
      "username": "tk421",
      "email": "tk421@stormpath.com",
      "password":"Changeme1"
      }'

    Using the Java SDK to complete this same task might look more familiar to Java developers:

    //Create the account object
    Account account = client.instantiate(Account.class);
    
    //Set the account properties
    account.setGivenName("Joe")
        .setSurname("Stormtrooper")
        .setUsername("tk421") //optional, defaults to email if unset
        .setEmail("tk421@stormpath.com")
        .setPassword("Changeme1");
    
    //Create the account using the existing Application object
    account = application.createAccount(account);

    API vs Implementation

    The Java SDK defines clear lines between its API and implementation. The project on GitHub shows these are both top-level directories.

    Java SDK on GitHub

    The api module contains the interfaces that developers will interact with. The project uses semantic versioning, which means these classes will not add or remove methods between patch releases. We are at liberty to add classes and methods between minor and major releases.

    The impl module is a little more fluid in that we’re allowed to change things between minor releases, as long as we maintain backwards compatibility. The classes in this module aren’t meant to be used directly by end users, and we caution developers not to cast to or otherwise use implementation classes.

    You can think about the API as a JDBC library. When you write Java code to interact with a database you will be writing generic JDBC statements. During development time, those statements are not tied to any particular DB at all. Later on, at runtime, a specific concrete DB library will be used by JDBC in order to interact with the concrete DB. You wrote (JDBC) code which allows you to switch between different DBs without (ideally) affecting your code.

    Our Java SDK separation between an api and an impl module allows your code to remain completely agnostic of the concrete classes and operations we use in order to interact with the backend. The backend does change from time to time and we need to be sure that our SDK can keep properly interacting with it. Therefore, we can (ideally) change our impl classes and your code will not need to be modified.
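    In day-to-day code this simply means declaring everything against the api interfaces and never reaching for the concrete classes. The commented-out cast below names an implementation class purely for illustration.

    // Good: program against the api interfaces only
    Account account = client.instantiate(Account.class);
    account.setGivenName("Joe");

    // Avoid: casting to an impl class couples your code to internals that can change at any time
    // DefaultAccount raw = (DefaultAccount) account;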

    Stack

    Our SDK is built from the ground up with a modular architecture in mind. The overall architecture is a stack where each module is in charge of a specific responsibility. Each module is available to be used not only by developers but also by other modules that provide higher-level functions.

    Java SDK Stack

    Having such an architecture allows each module to have a small footprint and to re-use much of the already existing code. For instance, take a look at our Spring Boot Starter module. It is only a single Java file! This is the rationale behind how our logic is separated:

    • Someone building a standalone Java application can rely on: Stormpath Java SDK
    • If you are building a web application in a non-spring environment: Stormpath Servlet Plugin
    • If you want to build a Spring Web application: Stormpath Spring WebMVC
    • Do you want to have Spring Security in your non-web Spring Boot application?: Stormpath Spring Boot Starter Security
    • You prefer to run Stormpath as a Gateway rather than having it inside your web application?: Stormpath Zuul Gateway
    • And so on…

    Network-agnostic

    Stormpath is a user management service that is hosted in the cloud. This means that the Java SDK needs to interact with a remote service that is running outside of your domain. Many developers assume that they will need to deal with some network-related coding at some point. The good news is that they are wrong. 🙂

    Our SDK completely abstracts the network nature of Stormpath. You will not need to do anything network-related for the SDK to be fully functional. Our SDK will automatically communicate with our REST API where data will be securely and efficiently transported for you. Remote operations (like login, data updates, etc) will happen behind the scenes without you even noticing they are traveling through the wire. The SDK caches most of the data locally, so operations are efficient and I/O is done in a responsible way.
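    For example, navigating from one resource to a related one looks like plain object traversal, even though it may trigger an HTTPS request under the hood. A small sketch, assuming you already hold an account reference:

    // Looks like a simple getter; if the directory isn't cached yet, the SDK
    // transparently calls the REST API and caches the result for next time
    Directory directory = account.getDirectory();
    System.out.println(directory.getName());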

    Integrations

    The rest of this post is a deep-dive into working with the Java SDK. We’ve used the SDK to build up the integrations pictured above.

    If you are not interested in close-to-the-metal software development using the Java SDK or you’re already using one of the Spring variants or Servlet in your project – good news!

    Using one of the many integrations we provide enables you to use Stormpath with little to no additional coding on your part.

    For instance, if you drop the stormpath-webmvc-spring-boot-starter module into your project, your Spring Boot app gets an /oauth/token endpoint with absolutely no additional coding on your part.

    Wanna get down and dirty with the Java SDK? Read on…

    Internal Concepts

    Data Model

    The main objective of the Java SDK is to provide a Java idiomatic development experience to interact with our Rest API-based backend service. This implies that the operations that it provides must be analogous to what can be done via a REST command hitting the backend directly.

    In order to accomplish that we modeled the resources provided by the backend. All of these resources are Java interfaces and their hierarchy is as follows:

    Java SDK Data Model

    • A Resource represents any entity that actually exists in the backend. This means that it has an href (i.e. the ID that uniquely identifies it in perpetuity)
      • Accounts, Tenants, Groups… etc are all Resources
    • Entities that can hold Accounts are AccountStores.
      • A Directory is a real account holder since accounts exist only inside the directory that holds them. This means that when the directory is deleted, its accounts also cease to exist
      • A Group is a virtual repository where Accounts can be added and removed without really affecting their existence
    • An AccountStoreHolder denotes a resource capable of referencing account stores.
      • For example Applications and Organizations.
      • You can think of an Organization as a ‘virtual’ AccountStore that ‘wraps’ other AccountStores
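    To make the hierarchy a little more concrete, here is roughly how those interfaces show up in code; the href values are placeholders.

    // Every Resource is identified by its href
    Account account = client.getResource("https://api.stormpath.com/v1/accounts/...", Account.class);

    // Directories and Groups are both AccountStores
    Directory directory = client.getResource("https://api.stormpath.com/v1/directories/...", Directory.class);
    Group group = client.getResource("https://api.stormpath.com/v1/groups/...", Group.class);

    // Applications and Organizations are AccountStoreHolders
    Application application = client.getResource("https://api.stormpath.com/v1/applications/...", Application.class);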

    Executing Operations in the Backend

    The above class diagram provides a simplified view of the data provided by the SDK, which mimics what the REST API provides in the backend. Hopefully, you now have an understanding of how the Resources are represented in Java. There is still one important aspect missing: the operations that they support.

    All the operations are provided by Java methods available via each corresponding resource. For example, in order to update the username of an Account you would do a POST with REST like this (using HTTPie):

    http -a APIKEY_ID_HERE:APIKEY_SECRET_HERE \
    POST https://api.stormpath.com/v1/accounts/1GFIRBu2pAE3POp0kE3ekE username=fooBar

    However with the Java SDK the operation will be as simple as:

    account.setUsername("fooBar");
    account.save();

    Or, in order to create an Application you would execute the following curl command:

    curl -X POST --user APIKEY_ID_HERE:APIKEY_SECRET_HERE \
         -H "Accept: application/json" \
         -H "Content-Type: application/json" \
         -d '{ "name" : "FooApp" }' \
    https://api.stormpath.com/v1/applications

    While with the Java SDK you can simply do:

    Application app = client.instantiate(Application.class);
    app.setName("FooApp");
    client.createApplication(app);

    These code snippets are useful to exemplify how some of the operations are available via the specific resources:

    • Resources’ properties are applied via the corresponding setters
    • The analogous operation to the POST method is save()
    • An Application can be created via the Client instance which provides a createApplication operation (among others)
    • Tenant credentials are not needed, the SDK takes care of that behind the scenes

    Of course, there are many more types of operations that the backend provides.

    The intention of this section was to show a simple example of how the REST API operations are mapped in the Java SDK. You can read more about all the available operations in our Java Product Guide.

    Volatile State

    Our Java SDK works by modifying data that actually resides in our backend. The SDK does not have a state per-se but it does keep data in a cache in order to improve processing speed. Our default cache mechanism keeps data in memory.

    This volatile state is not shared by different instances. Each process has its own independent state; all of them modify the same data in Stormpath’s backend, but that data is not proactively pushed to the SDK in any way. The data is only available locally once the SDK retrieves (pulls) it when needed. This data is later updated locally only via write operations or cache timeouts. In other words, the backend never pushes data to the SDK; the SDK has to pull it.

    If your application requires different instances of the Java SDK to work concurrently in a distributed environment then you will need to use a distributed cache like Hazelcast. The good news is that we already provide a Hazelcast implementation which is readily available to be used as well.
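    Whichever cache you use, it is wired in through the client builder. The sketch below tunes the default in-memory cache with the SDK’s Caches helper; the timeout values are arbitrary examples, not recommendations.

    CacheManager cacheManager = Caches.newCacheManager()
        .withDefaultTimeToLive(1, TimeUnit.HOURS)      // example value
        .withDefaultTimeToIdle(30, TimeUnit.MINUTES)   // example value
        .build();

    Client client = Clients.builder()
        .setCacheManager(cacheManager)
        .build();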

    Developer API

    As its name denotes, the Java SDK is meant to be used by developers to create their own applications enriched with Stormpath-related functions. They will need to write code in order to interact with us programmatically. Our programming model therefore needs to be simple to use, where simple means: well documented, consistent in experience, and intuitive.

    Our Javadocs are your friend and can be found here. Jumping right into the javadocs can be daunting, however. Let’s take a look at some examples that exemplify the consistent experience in the Java SDK.

    Patterns

    The first thing you need in order to work with Stormpath from the Java SDK is a Client. The easiest way to get a hold of a Client is like so:

    Client client = Clients.builder().build();

    NOTE: Creating a client is something you should only do once in your application. It references an internal CacheManager and creating multiple copies could create state management issues.

    There’s a lot of Java SDK goodness hidden in that one line! You can see we use a builder pattern for creating objects. The builder pattern enables the creation of objects without explicit instantiation (using the new keyword). This is very important to the design of the SDK as it enables the separation of the api layer from the impl layer.

    Let’s see how you can get some more out of Client by providing some configuration.

    Configuration

    In order to create a Client object, the Java SDK needs to know the base URL for the Stormpath environment you are working with and it needs to have a set of API keys in order to authenticate against the Stormpath API backend.

    The Java SDK uses some sensible defaults to reduce the amount of coding you need to do. It will automatically look for the API keys in: ~/.stormpath/apiKey.properties. And, it will use the community cloud base URL by default: https://api.stormpath.com/v1.

    The one-liner above would work without alteration under these circumstances. However, the fluent interface and the builder pattern keep the code very readable if you are not using these defaults.

    ApiKey apiKey = ApiKeys.builder()
        .setFileLocation("/path/to/apiKey.properties")
        .build();
    
    Client client = Clients.builder()
        .setApiKey(apiKey)
        .setBaseUrl("https://enterprise.stormpath.io/v1")
        .build();

    In this scenario, your API key file is in an alternate location (/path/to/apiKey.properties) and you’re using Stormpath’s Enterprise environment (https://enterprise.stormpath.io/v1).

    Notice how even the ApiKeys class uses the builder pattern – consistency!

    Let’s take a look at how you can use a request pattern to interact further with the Java SDK.

    OAuth 2.0 Requests

    Amongst the most powerful features of Stormpath is our OAuth 2.0 service. The Java SDK provides a consistent interface for working with common OAuth2 workflows. For a more in-depth look at OAuth2, look here.

    Building on the previous section, use the Client to get a hold of an Application. You can make the application interact with OAuth2 quite easily.

    // where() and name() are static imports from com.stormpath.sdk.application.Applications
    Client client = Clients.builder().build();
    
    Application application = client
        .getApplications(where(name().eqIgnoreCase("My Application")))
        .single();

    The above code shows how you can search for an Application by name using criteria. This code example uses a fluent pattern; one that is used for collections all throughout the Java SDK. The fluent interface allows for method chaining and a method terminator to return a concrete object (an Application in this case).
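    The same collection style handles listing and paging. For instance, something along these lines should iterate over matching applications; the search term and page size are made up for the example.

    ApplicationList applications = client.getApplications(
        Applications.where(Applications.name().containsIgnoreCase("My"))
            .limitTo(25));

    for (Application app : applications) {
        System.out.println(app.getName());
    }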

    One of the common OAuth2 flows is obtaining access and refresh tokens. Another common flow is using a refresh token to obtain a new access token (its only purpose in life).

    OAuthGrantRequestAuthentication request = OAuthRequests.OAUTH_PASSWORD_GRANT_REQUEST.builder()
        .setLogin("me@me.com")
        .setPassword("super_secret")
        .build();
    
    OAuthGrantRequestAuthenticationResult result = Authenticators.OAUTH_PASSWORD_GRANT_REQUEST_AUTHENTICATOR
        .forApplication(application)
        .authenticate(request);
    
    String accessToken = result.getAccessTokenString();
    String refreshToken = result.getRefreshTokenString();

    There are two distinct parts in the code above: (1) building a request (line 1) and (2) authenticating using the request (line 6).

    See the fluent interface and builder patterns at work?

    Now, look at how you can refresh the access token:

    OAuthGrantRequestAuthentication refreshRequest = OAuthRequests.OAUTH_REFRESH_TOKEN_REQUEST.builder()
        .setRefreshToken(refreshToken)
        .build();
    
    result = Authenticators.OAUTH_REFRESH_TOKEN_REQUEST_AUTHENTICATOR
        .forApplication(application)
        .authenticate(refreshRequest);

    Look familiar? Once again, this builds a request and gets a result.

    Pop-quiz: What do all of the above code examples NOT have? Answer: the new keyword. If you stick to using the api packages and the SDK design patterns, we can improve the implementations without you ever having to update your code.

    That’s why we recommend having the api module as a compile-time dependency and the impl module as a runtime dependency. Here’s what that looks like in a pom.xml file:

    
    
    <project>
        <modelVersion>4.0.0</modelVersion>
        ...
        <dependencies>
            <dependency>
                <groupId>com.stormpath.sdk</groupId>
                <artifactId>stormpath-sdk-api</artifactId>
                <version>${stormpath.version}</version>
                <scope>compile</scope>
            </dependency>
            <dependency>
                <groupId>com.stormpath.sdk</groupId>
                <artifactId>stormpath-sdk-httpclient</artifactId>
                <version>${stormpath.version}</version>
                <scope>runtime</scope>
            </dependency>
        </dependencies>
        ...
    </project>
    

    You can see by now that fluent interface, builder pattern, request pattern, and search criteria are used all over to provide a consistent and readable developer experience with the Java SDK.

    Kudos

    Thanks to Stormpath’s Mario Antollini and Micah Silverman for writing most of this post and the Java SDK itself.

    Summary

    Stormpath’s Java SDK was built to help developers create their own applications enriched with Stormpath-related functionality. Our programming model uses a style that’s familiar to Java developers. You can easily invoke REST commands to the backend, without worrying about networking or connectivity. Its modular design allows you to pick the component you need and the web integrations don’t require you to write any code. Finally, it’s well documented with tutorials and examples, as well as Javadocs.

    Stormpath’s Java SDK is fully open source on GitHub, with an Apache 2.0 license.

    The post The Architecture of Stormpath’s Java SDK appeared first on Stormpath User Identity API.


    String Interpolation with Apache Shiro

    $
    0
    0

    I am happy to announce the 0.8.0-RC1 release of our Stormpath-Shiro integration.
    This release builds on top of the recent Apache Shiro 1.4.0-RC2 release.

    The 1.4.0 Apache Shiro release adds a handful of great features.

    Of course, that’s not all; the Stormpath integration includes new features of its own.

    I’ll cover most of these new features in upcoming blog posts, but for today let’s jump into string interpolation.

    String Interpolation

    This is one of those things that you know either by this term or another (substitution, filtering, etc). Basically, it comes down to evaluating a string containing one or more placeholders.
    For example, in the Java world, the most common placeholder looks something like this: ${keyName}. Many Java tools and libraries use this format: Apache Maven, Gradle, Spring, Groovy, etc.

    Enabling string interpolation in Apache Shiro is as simple as including a Maven dependency (or the equivalent in Gradle):

    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-configuration2</artifactId>
        <version>2.1</version>
    </dependency>

    This step is not needed for the stormpath-shiro-servlet-plugin, as this dependency is included by default.

    Out of the box, system properties, environment variables, and Java constants are available for use.

    Using the following shiro.ini example, we can replace the environment-specific values with placeholders.

    [main]
    ds = com.mysql.jdbc.Driver
    ds.serverName = localhost
    ds.databaseName = db_name
    
    jdbcRealm = org.apache.shiro.realm.jdbc.JdbcRealm
    jdbcRealm.dataSource = $ds

    Using the system properties dbDriver, dbHost, and dbName, we end up with a shiro.ini that looks like:

    [main]
    ds = ${dbDriver}
    ds.serverName = ${dbHost}
    ds.databaseName = ${dbName}
    
    jdbcRealm = org.apache.shiro.realm.jdbc.JdbcRealm
    jdbcRealm.dataSource = $ds

    You can even set default values since we are just using the Apache Commons Configuration library. Taking this example one step further, we can also define a default value for dbDriver:

    [main]
    ds = ${dbDriver:-com.mysql.jdbc.Driver}
    ds.serverName = ${dbHost}
    ds.databaseName = ${dbName}
    
    jdbcRealm = org.apache.shiro.realm.jdbc.JdbcRealm
    jdbcRealm.dataSource = $ds

    Personally, I prefer system properties over environment variables (as they are not inherited by forked processes), but those would work too:

    [main]
    ds = ${DB_DRIVER:-com.mysql.jdbc.Driver}
    ds.serverName = ${DB_HOST}
    ds.databaseName = ${DB_NAME}
    
    jdbcRealm = org.apache.shiro.realm.jdbc.JdbcRealm
    jdbcRealm.dataSource = $ds

    Java constants are handled slightly differently: they need to be prefixed with const, so for java.nio.charset.StandardCharsets.UTF_8 you end up with ${const:java.nio.charset.StandardCharsets.UTF_8}.

    Easy enough. Now you can safely commit your shiro.ini file to source control without worrying about protecting your secrets or dealing with differences between environments.
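    If you just want to see interpolation working locally, one option is to set the values programmatically before Shiro parses the INI; in a real deployment you would normally pass them as -D flags or environment variables instead. A minimal sketch:

    // These must be set before the shiro.ini file is loaded
    System.setProperty("dbDriver", "com.mysql.jdbc.Driver");
    System.setProperty("dbHost", "localhost");
    System.setProperty("dbName", "db_name");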

    A couple final points:

    • Do not confuse the ${key} notation with Shiro’s $variable. They work on the same concept, but the first is a string, while the latter is an object defined in shiro.ini; in the above example $ds is a JDBC DataSource.
    • We recommend against storing passwords in System Properties.
    • If you are using the stormpath-shiro-servlet-plugin you can also use any of the stormpath.* properties; for an example, see the default config.

    The post String Interpolation with Apache Shiro appeared first on Stormpath User Identity API.

    Product Documentation vs. Knowledge Base

    $
    0
    0

    Determining the right location for information is one of the fundamental problems faced by application developers, technical writers, and customer support engineers. It’s no good having information that no one can find, and the first step in making sure your users find it is putting it somewhere they’re likely to look. At Stormpath, we’ve got two places where information lives: our Product Documentation and our Knowledge Base. These two live in different places, are largely maintained by different teams and until recently also contained a lot of the same information. This isn’t just a Stormpath problem, it’s something that crops up very often as a company grows.

    The overlap caused no end of confusion for customers and employees alike, as they might find the same information on two different pages explained in two different ways by two different people. It was also a maintenance nightmare since any update had to start with you finding all the places that something was documented in the first place.

    Product Documentation

    Our Product Documentation lives separately for a number of reasons. First, the Product team wanted to build multiple Product Guides, with custom code, pre-processors, if/then/else logic, and so on. All of this was possible with a combination of markup and static-site generators, but not possible with Zendesk.

    Secondly, we wanted consistency with the code documentation. A lot of our language-specific API documentation is generated out of the code using tools like Javadocs and Sphinx. It’s easy to make this documentation look and feel consistent using static-site generators.

    Finally, on a team of mostly developers, there’s an understandable preference for markup and plain text over a Zendesk/Wordpress/etc. WYSIWYG interface.

    Having established why we wanted separate Product Documentation, we came up with these goals and standards for it:

    • Product Documentation is built around product functionality (i.e. it’s PROactive)
    • Structure represents how a developer would actually need to learn about Stormpath and think about their own project. You could (if you wish) read it end-to-end.
    • Strives to cover all product functionality, but no more
    • Every feature should be documented, even if it is rarely used
    • Doesn’t discuss external systems or integrations unless they are an official part of the product (e.g. no “unofficial workarounds”)

    What does or doesn’t belong in there is usually a game-time decision by the Product team, but in general: workarounds, specific instructions on using Stormpath with non-Stormpath tools or libraries, and short answers to specific questions all do not belong in Product Documentation.

    Knowledge Base

    The Knowledge Base (KB) is maintained by the Customer Success team, who work out of Zendesk. Zendesk came equipped with a KB feature, and it’s very nicely integrated into their question submission workflows. This means that if a customer wants to ask a question, they can be shown relevant KB articles. At the same time, Zendesk also provides search results to the Customer Success team when they’re answering a question, saving time that would otherwise be spent writing redundant answers. Sounds great, right?

    Of course what many teams quickly realize is that the road from a customer question to documentation is not at all straightforward. How can you tell if a customer’s question should be forwarded to the Product team and added to the documentation queue, or whether a simple KB article will do?

    So when we set about defining what rightly belongs in our Knowledge Base, this is what we came up with:

    • The Knowledge Base is built around customer inquiries (i.e. it’s REactive)
    • Its focus is troubleshooting
    • KB article topics come from common customer questions whose answers don’t belong in the Product Documentation (see above)
    • Every article is structured around a Problem and a Solution

    Every Knowledge Base should aim to help a customer with a specific problem get an answer as quickly as possible.

    This is all of course still evolving, but we’re happy with the division we’ve arrived at. Product Documentation aims to be comprehensive, structured, and accurate. Everything that you would want from the canonical source of information for your product. The Knowledge Base doesn’t aim to be comprehensive, it contains little-to-no structure in and of itself, and the articles are short and to the point.

    The next goal is ensuring that customers are able to find the information in both sources as quickly and intuitively as possible. The knowledge management challenges never end!

    Questions? Opinions? Leave them in the comments below, and don’t forget to check out my post on why video documentation is never the right answer!

    The post Product Documentation vs. Knowledge Base appeared first on Stormpath User Identity API.

    Hallo Deutschland! Stormpath Launches European Enterprise Region

    $
    0
    0

    Today we’re excited to announce the launch of Enterprise Cloud service in Europe! Our new EU region will help European customers keep their user data wholly in-region. We’ve selected the AWS data storage center in Frankfurt, Germany as Germany has some of the strictest data privacy regulations in the EU. This will not only improve login latency for EU-based end users but will also help our European customers comply with EU privacy and data isolation regulations.

    EU User Data Isolation

    Our new EU region offers all the features of the public Stormpath API, including user management, token authentication, multi-tenancy, multi-factor authentication, and authorization. As an enterprise customer, you can choose either the Stormpath US or EU Enterprise Clouds — or both! — to run Stormpath on infrastructure dedicated to your region. This provides powerful global identity management with isolated, geo-located infrastructure.

    Get Started

    If you have an EU-based project and want to know more about the Stormpath Europe Enterprise Region, our sales team is happy to walk you through the details.

    If you’re new to Stormpath, we offer the same core API functionality across all our environments. You can test drive Stormpath and our integrations with a developer account on our Public API at any time, for free. Get Started Now!

    Why Stormpath for User Management?

    With Stormpath, you can launch your application faster with industry-leading security, allowing you to focus on the core features that will make your project a success.

    “The goal of Stormpath is to free up developers’ time so they can focus on what really matters to their product and business,” said Les Hazlewood, co-founder and CTO. “Managing users in the cloud is complicated and risky; we can put years of security expertise and best practice in their applications in less time than it takes to make coffee.”

    “Stormpath is at the forefront of application security as a service,” says Ross Mason, founder of Mulesoft. “Since every application requires user security, this service reduces the infrastructure burden, making it possible to avoid rolling the same code over and over.”

    EU-US Privacy Shield

    Stormpath has long been proud to comply with the EU-US Privacy Shield Framework as set forth by the US Department of Commerce regarding the collection, use, and retention of personal information from European Union member countries when their data resides in our US region.

    Stormpath has certified that it adheres to the Privacy Shield Principles of Notice, Choice, Accountability for Onward Transfer, Security, Data Integrity and Purpose Limitation, Access, and Recourse, Enforcement, and Liability.

    As always, we’re happy to answer any questions you might have, or get you started on our new EU Cloud. You can reach out to us on Twitter @gostormpath or connect via email.

    The post Hallo Deutschland! Stormpath Launches European Enterprise Region appeared first on Stormpath User Identity API.

    5 Must-Have Visual Studio Code Extensions

    $
    0
    0

    Visual Studio Code is Microsoft’s fully cross-platform IDE. It is beautiful, easy to use, and lightweight. Its slimness can be attributed to the fact that it does only the basics by default and adds functionality via extensions. This lets developers start with the basics and add only those things they really need. There are some Visual Studio Code extensions, however, that I recommend to developers right off the bat. These are my top five.

    1. EditorConfig for VS Code

    This extension is like sharing your Visual Studio settings file with anyone who gets your code. In the root of the application is a .editorconfig file that lets you set certain code style guidelines. Things like whether your indents are tabs or spaces (and how many spaces a tab should be). This can cut down on friction when working with a team and for consultants when switching between projects for multiple clients.
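    A minimal .editorconfig might look something like this; the values are just an example, not a recommendation.

    root = true

    [*]
    indent_style = space
    indent_size = 4
    end_of_line = lf
    insert_final_newline = true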

    2. Auto-Open Markdown Preview

    Whether you’re creating repository README files or writing posts for your blog, markdown is an easy way for developers to document applications and get great code highlighting. When writing markdown documents, however, it can be frustrating to constantly rebuild them just to preview the output. This extension opens markdown documents in a split pane with the markdown on one side and the preview on the other. This makes writing markdown documents almost seamless.

    3. Git History (git log)

    For those developers using Git for source control, git log can be helpful, but getting a good readable git log takes a plethora of switches (color, pretty, etc.). The Git History extension gives developers that same easy-to-read log with just a few key chords, just like the one below.

    Git History log view

    4. Docker Support

    The Docker Support plugin gives you code highlighting, snippets, and IntelliSense for your Dockerfile. It has been immensely helpful as I learn to use Docker and as I continue on to Docker Compose and Docker Swarm.

    5. ES Lint

    No development would be complete without some JavaScript these days. If you’re going to write JavaScript (and you probably are), you might as well write it well (or at least follow the community-agreed conventions). The ESLint extension reads your .eslintrc file, checks your code as you write it, and provides feedback within the IDE, with little red and yellow squiggles under the warnings and mistakes.

    What’s Missing?

    I still do a lot of .NET development, and VS Code is great for .NET Core development. I would love to see an extension that gives me the Add New Controller item and scaffolds out the code with the filename and the namespace. On the plus side, I can write my own extensions!

    Any extensions that you like? Ones that you find yourself using all the time? Let me know in the comments below, or hit me up on Twitter @leebrandt! Also, check out these awesome resources to learn more about how Stormpath can help you never build auth again:

    The post 5 Must-Have Visual Studio Code Extensions appeared first on Stormpath User Identity API.

    Build a REST API for your Mobile Apps with ASP.NET Core

    $
    0
    0

    Nowadays, RESTful APIs are the standard way of exposing backends to applications.
    They allow you to share your business logic between different clients with a low level of coupling through a super-standardized protocol: HTTP.

    One of the biggest challenges when building a REST API is authentication. Typically, we manage this with JWTs. Unfortunately, ASP.NET Core doesn’t fully support this out of the box.

    The good news is that the Stormpath ASP.NET Core library allows us to add JWT authentication to any API with minimal configuration.

    In this tutorial, we will create a REST API in ASP.NET Core to manage a list of books.
    Our example API will allow users to register and login to manage their books. This simple project could be the base for a future social media application that connects readers and supports book reviews.
    The source code is available on GitHub, so feel free to check the finished code and play with it.

    Let’s get started!

    Create the Web API Project

    Open up Visual Studio, and create a new ASP.NET Core Web Application project.

    booksAPI project

    Select the “Web API” template and make sure authentication is set to “No Authentication”.

    booksAPI project

    Now, we are going to create our Book model. Add a folder named “Models” at the root of the project, and then inside of it create the Book class:

    public class Book
    {
       public int Id { get; set; }
       public string Title { get; set; }
       public string Author { get; set; }
       public DateTime PublishedDate { get; set; }
    }

    Create a new folder named “Services”. To save our books, we are going to use a great new feature available in EF Core: the in-memory data provider. This feature is awesome because we don’t have to spend time setting up a database to test our API. Later on, we can easily swap this provider for one that uses persistent storage, like a database.

    If you’re interested in exploring EF Core’s in-memory data provider further, check out Nate’s article on the subject!

    Set Up Entity Framework Core

    Right-click on your project and select “Manage NuGet packages”. Then, add the package Microsoft.EntityFrameworkCore.InMemory

    Install EF Core

    Add the BooksAPIContext class inside the Models folder, which will implement the DbContext class and will be responsible for the interactions between our application and the data provider.

    public class BooksAPIContext : DbContext
    {
        public BooksAPIContext(DbContextOptions options) : base(options)
        {
        }
    
    
        public DbSet<Book> Books { get; set; }
    }

    In the ConfigureServices method of the Startup class, we are going to configure our context to use the in-memory data provider:

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
       services.AddDbContext<BooksAPIContext>(x => x.UseInMemoryDatabase());
       // Add framework services.
       services.AddMvc();
    }

    Create the IBookRepository interface inside of the Services folder:

    public interface IBookRepository
    {
       Book Add(Book book);
       IEnumerable<Book> GetAll();
       Book GetById(int id);
       void Delete(Book book);
       void Update(Book book);
    }

    And a concrete InMemoryBookRepository class that will use the BookAPIContext to interact with the in-memory database:

    public class InMemoryBookRepository : IBookRepository
    {
        private readonly BooksAPIContext _context;
    
    
        public InMemoryBookRepository(BooksAPIContext context)
        {
            _context = context;
        }
    
    
        public Book Add(Book book)
        {
            var addedBook = _context.Add(book);
            _context.SaveChanges();
            book.Id = addedBook.Entity.Id;
    
    
            return book;
        }
    
    
        public void Delete(Book book)
        {
            _context.Remove(book);
            _context.SaveChanges();
        }
    
    
        public IEnumerable<Book> GetAll()
        {
            return _context.Books.ToList();
        }
    
    
        public Book GetById(int id)
        {
            return _context.Books.SingleOrDefault(x => x.Id == id);
        }
    
    
        public void Update(Book book)
        {
            var bookToUpdate = GetById(book.Id);
            bookToUpdate.Author = book.Author;
            bookToUpdate.Title = book.Title;
            bookToUpdate.PublishedDate = book.PublishedDate;
            _context.Update(bookToUpdate);
            _context.SaveChanges();
        }
    }

    Don’t forget to register the repository as an injectable service within the ConfigureService in the Startup class:

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
       services.AddDbContext<BooksAPIContext>(x => x.UseInMemoryDatabase());
       services.AddTransient<IBookRepository, InMemoryBookRepository>();
       // Add framework services.
       services.AddMvc();
    }

    Now that you have set up your data layer, let’s dive into the Web API controller!

    Create the Book Web API Controller

    Before going any further, make sure to delete the boilerplate ValuesController that the framework created automatically. Also, modify the launchSettings.json file and make sure the launch URL of the profile you are using is pointing to a valid URL. In this example, we will point to our book controller:

    {
      "iisSettings": {
        "windowsAuthentication": false,
        "anonymousAuthentication": true,
        "iisExpress": {
          "applicationUrl": "http://localhost:63595/",
          "sslPort": 0
        }
      },
      "profiles": {
        "IIS Express": {
          "commandName": "IISExpress",
          "launchBrowser": true,
          "launchUrl": "book",
          "environmentVariables": {
            "ASPNETCORE_ENVIRONMENT": "Development"
          }
        },
        "BooksAPI": {
          "commandName": "Project",
          "launchBrowser": true,
          "launchUrl": "http://localhost:5000/book",
          "environmentVariables": {
            "ASPNETCORE_ENVIRONMENT": "Development"
          }
        }
      }
    }

    Create a Web API Controller Class in the Controllers folder and name it BookController.

    The auto-generated code will look like this:

    [Route("api/[controller]")]
    public class BookController : Controller
    {
        // GET: api/values
        [HttpGet]
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }
    
    
        // GET api/values/5
        [HttpGet("{id}")]
        public string Get(int id)
        {
            return "value";
        }
    
    
        // POST api/values
        [HttpPost]
        public void Post([FromBody]string value)
        {
        }
    
    
        // PUT api/values/5
        [HttpPut("{id}")]
        public void Put(int id, [FromBody]string value)
        {
        }
    
    
        // DELETE api/values/5
        [HttpDelete("{id}")]
        public void Delete(int id)
        {
        }
    }

    The framework automatically generated a lot of code for us. There is a method for each HTTP verb that our controller will handle. As a refresher, REST convention maps each HTTP verb to a different action on our resources: GET reads them, POST creates them, PUT updates them, and DELETE removes them.

    aspnet-mobile-api-table

    You will also see the controller has a Route attribute, with the value api/[controller]. This defines the base route for all of this controller’s endpoints, which in this case is api/book. We will change this to make the base route book alone. It should look like this:

    [Route("[controller]")]

    This attribute can also be applied at method-level if you need to define custom routes for a specific endpoint.
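
    For example, a hypothetical “latest book” action could get its own route like this (the action and route below are illustrative only; they are not part of the finished API):

    // Sketch only: a method-level route, reachable at /book/latest
    [HttpGet("latest")]
    public Book GetLatest()
    {
       return _bookRepository.GetAll()
           .OrderByDescending(b => b.PublishedDate)
           .FirstOrDefault();
    }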

    The Get method (as well as the Put and Delete methods) has an “id” element in its HTTP verb attribute:

    HttpGet("{id}")

    This is a placeholder for the “id” parameter in the URL of the endpoint. For example, for the Get method, the URL will be:

    /book/{id}

    The framework automagically maps the parameters defined in these attributes to the parameters of the method in the controller. Awesome, huh?

    We will now write the code to handle each request to our API:

    [Route("[controller]")]
    public class BookController : Controller
    {
        private readonly IBookRepository _bookRepository;
    
    
        public BookController(IBookRepository bookRepository)
        {
            _bookRepository = bookRepository;
        }
    
    
        // GET: book
        [HttpGet]
        public IEnumerable<Book> Get()
        {
            return _bookRepository.GetAll();
        }
    
    
        // GET book/5
        [HttpGet("{id}", Name = "GetBook")]
        public IActionResult Get(int id)
        {
            var book = _bookRepository.GetById(id);
            if (book == null)
            {
                return NotFound();
            }
    
            return Ok(book);
        }
    
    
        // POST book
        [HttpPost]
        public IActionResult Post([FromBody]Book value)
        {
            if (value == null)
            {
                return BadRequest();
            }
            var createdBook = _bookRepository.Add(value);
    
    
            return CreatedAtRoute("GetBook", new { id = createdBook.Id }, createdBook);
        }
    
    
        // PUT book/5
        [HttpPut("{id}")]
        public IActionResult Put(int id, [FromBody]Book value)
        {
            if (value == null)
            {
                return BadRequest();
            }
    
    
            var book = _bookRepository.GetById(id);


            if (book == null)
            {
                return NotFound();
            }
    
    
            value.Id = id;
            _bookRepository.Update(value);
    
    
            return NoContent();
    
    
        }
    
    
        // DELETE book/5
        [HttpDelete("{id}")]
        public IActionResult Delete(int id)
        {
            var book = _bookRepository.GetById(id);
            if (book == null)
            {
                return NotFound();
            }
            _bookRepository.Delete(book);
    
    
            return NoContent();
        }
    
    
    }

    Our CRUD API is now functional. Before we test it, let’s lock it down with authentication.

    Add JWT authentication using Stormpath

    So far this is a totally public API, so any user can get, create, edit, and delete any book they want. That’s not very secure! We will now add authentication to our API through JWT tokens.

    If you want to refresh your knowledge, check out our overview of token authentication and JWTs!

    As of today, ASP.NET Core supports protecting routes with Bearer header JWTs. But, unlike the ASP.NET 4.x Web API framework, it doesn’t have support for issuing them. To do this, you will need to write custom middleware or use external packages. There are several options; you can read Nate’s article to learn more about this.

    Lucky for us, Token Authentication with JWT becomes extremely easy using the Stormpath ASP.NET Core library – I’ll show you how.

    Get your Stormpath API credentials

    To communicate with Stormpath, your application needs a set of API Keys. Grab them from your Stormpath account (If you haven’t already registered for Stormpath, you can create a free developer account here).
    Once you have them, you should store them in environment variables. Open up the command line and execute these commands:

    setx STORMPATH_CLIENT_APIKEY_ID "<your_api_key_id>"
    setx STORMPATH_CLIENT_APIKEY_SECRET "<your_api_key_secret>"

    Restart Visual Studio to pick up the environment variables from your OS.

    Integrate Stormpath with the Web API

    Right-click on your project and select “Manage NuGet packages”. Then, add the Stormpath.AspNetCore package.

    Install Stormpath

    To use the Stormpath API for access token authentication, add this configuration to the ConfigureServices method in Startup.cs.

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddStormpath(new StormpathConfiguration()
        {
            Web = new WebConfiguration()
            {
                // This explicitly tells the Stormpath middleware to only serve JSON responses (appropriate for an API).
                // By default, HTML responses are served too.
                Produces = new[] {"application/json"},
                Oauth2 = new WebOauth2RouteConfiguration()
                {
                    Uri = "/token",
                }
            }
        });
    ...
    }

    As a personal preference, I changed the default token endpoint URI ("/oauth/token") to /token.
    Options that are not overridden by explicit configuration will retain their default values.

    Make sure you add the Stormpath middleware before any middleware that requires protection, such as MVC.

    You can learn more about configuration options in the Stormpath Product Documentation: https://docs.stormpath.com/dotnet/aspnetcore/latest/configuration.html#configuration

    Now, find the Configure method and add Stormpath to your middleware pipeline.

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
    {
        loggerFactory.AddConsole(Configuration.GetSection("Logging"));
        loggerFactory.AddDebug();
        app.UseStormpath();
        app.UseMvc();
    }

    Finally, to effectively protect the controller, add the Authorize attribute:

    [Authorize]
    [Route("[controller]")]
    public class BookController : Controller
    {
        ...
    }

    That’s all! Exhausted yet? 😉

    Test Your Web API with Postman

    Now, let’s test our Web API. I’m using Postman for this tutorial, but feel free to use any REST client you like. To register a new user, we need to make a POST request to the /register endpoint, passing the required data on the body:

    { "givenName": "MyAPIUser", "surname": "Tester", "email": "test3@example.com", "password": "TestTest1" }

    Register Stormpath user

    With our user created, we can get a token by POSTing to the /token endpoint.
    The payload should be a URL-encoded form with your credentials:

    grant_type: password
    username: test3@example.com
    password: TestTest1

    Get your Bearer Token

    The access token you received should be sent to the server on every request, on the Authorization header. The value of the header should be Bearer <your_token>:

    Use your Bearer Token
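
    If you prefer the command line to Postman, the same flow looks roughly like this with curl (the port comes from your launch profile, and the token value is a placeholder):

    # Sketch: request a token with URL-encoded credentials
    curl -X POST http://localhost:5000/token \
         -d "grant_type=password&username=test3@example.com&password=TestTest1"

    # Sketch: call the protected API with the access_token from the response
    curl http://localhost:5000/book \
         -H "Authorization: Bearer <your_access_token>"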

    Let’s add a new book:

    Add a book

    The status code 201 indicates our application has created a new book successfully.

    If you view your list of books now, you should see the one you just created has been added:

    Get books

    Congratulations! You just created a web API with ASP.NET Core. 🙂

    Learn More

    The post Build a REST API for your Mobile Apps with ASP.NET Core appeared first on Stormpath User Identity API.

    Tips and Tricks for AngularJS and Spring Boot with Stormpath


    In October, I showed you how to integrate AngularJS, Spring Boot, and Stormpath. As part of that tutorial, I demonstrated how to use our AngularJS SDK to create registration, login, and forgot password screens. I also showed how to configure Spring Boot to allow cross-domain requests.

    Today, I’ll show you how to 1) restrict access to functionality with our AngularJS SDK and 2) use our Spring Boot Starter v1.2.0 to simplify HTTP access control (CORS). Restricting access allows you to show different sections of your UI to different sets of users. With our CORS configuration, the console error you get when logging out goes away. Taken together, these tips can increase security and improve the user experience for your application.

    Restrict Access with Stormpath’s AngularJS SDK

    Stormpath’s AngularJS SDK can be used to prevent access to certain states and hide links for different groups. On the Java side, you can use Spring Security to control access by group membership. To begin, clone the project I created for the last tutorial:

    git clone https://github.com/stormpath/stormpath-angularjs-spring-boot-example.git \
    stormpath-angularjs-access-control-cors-example

    To run the Spring Boot backend, execute mvn spring-boot:run. To run the AngularJS frontend, in another terminal window, execute npm install && gulp.

    In the last tutorial, I showed how you can use if-user and if-not-user directives to show/hide elements when users are logged in/logged out.

    In addition, you can protect states in their ui-router configuration. For example, “view2” and “search” are not protected, so it should be possible to navigate to them directly with http://localhost:3000/view2 and http://localhost:3000/search. However, when you try these, you’ll see an error in your browser.

    Cannot GET /view2/

    This is not caused by Angular or Stormpath, but rather Browsersync’s configuration. It’s not configured to work with AngularJS when it’s in HTML5 mode. The good news is it’s easy to fix.

    First, install connect-history-api-fallback.

    npm install connect-history-api-fallback --save-dev

    Then modify gulpfile.js to import it and change Browsersync to use it.

    var historyApiFallback = require('connect-history-api-fallback');
    ...
    gulp.task('serve', function() {
    
        browserSync.init({
            server: {
              baseDir: './app',
              middleware: [ historyApiFallback() ]
            }
        });

    After making these changes, you can navigate to http://localhost:3000/view2 or http://localhost:3000/search.

    view2 unprotected

    search unprotected

    Using Stormpath’s AngularJS SDK, configure “view2” so it requires the user to be authenticated, and make it so “search” can only be accessed by users in the “admin” group. But first, upgrade to the latest version by running:

    bower install

    Open app/view2/view2.js and use Stormpath’s State Config to ensure the user is authenticated.

    .config(['$stateProvider', function($stateProvider) {
      $stateProvider.state('view2', {
        url: '/view2',
        templateUrl: 'view2/view2.html',
        controller: 'View2Ctrl',
        sp: {
          authenticate: true
        }
      });
    }])

    After making this change, when you try to navigate to http://localhost:3000/view2, you’ll be redirected to login. If it works, good job!

    To limit the “search” state to the “admin” group, modify app/search/search.state.js:

    function stateConfig($stateProvider) {
      $stateProvider
        .state('search', {
          url: '/search',
          templateUrl: 'search/search.html',
          controller: 'SearchController',
          controllerAs: 'vm',
          sp: {
            authorize: {
              group: 'admin'
            }
          }
        });
    }

    TIP: You can also use the data.authorities to specify groups. We added support for this configuration because this is the configuration that JHipster expects. It also allows you to configure multiple groups.

    function stateConfig($stateProvider) {
      $stateProvider
        .state('search', {
          url: '/search',
          templateUrl: 'search/search.html',
          controller: 'SearchController',
          controllerAs: 'vm',
          data: {
            authorities: ['admin']
          }
        });
    }

    After making this change, you’ll need to perform two additional tasks to make it work. First, you’ll need to create a group named “admin” in the admin console. The abbreviated steps are as follows:

    1. Create a new Application
    2. Create a new Group called “admin” for the Application
    3. Create a new Account in the admin Group
    4. Create a new Account NOT in the admin Group

    Click here for a visual walkthrough of this process.

    Next, you’ll need to configure Spring Boot to show group information when calling the /me endpoint. Modify src/main/resources/application.properties to expand groups.

    stormpath.web.me.expand.groups = true

    Also, modify pom.xml so you’re using the latest release of Stormpath’s Spring Boot support.

    ...
        <parent>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>1.4.2.RELEASE</version>
        </parent>
    ...
            <dependency>
                <groupId>com.stormpath.spring</groupId>
                <artifactId>stormpath-default-spring-boot-starter</artifactId>
                <version>1.2.1</version>
            </dependency>
    ...

    Spring Boot’s developer tools and Browsersync will restart everything for you. If you see the following error in your browser console, restart Spring Boot manually.

    TypeError: Cannot read property 'filter' of undefined

    Now if you try to navigate to http://localhost:3000/search as a non-admin user, you won’t be able to. If you login as an admin user, you’ll be able to see the screen. If you’d like to alert the user that they’ve been denied access to this state, you can use something like the following:

    $scope.$on('$stateChangeUnauthorized', function () {
      $state.go('accessdenied');
    });

    Of course, you’ll need to define the accessdenied state for this to work.
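
    A minimal accessdenied state could look something like this (the URL and inline template are assumptions; use whatever markup fits your app):

    $stateProvider.state('accessdenied', {
      url: '/accessdenied',
      template: '<h3>Access denied</h3><p>You are not authorized to view this page.</p>'
    });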

    The final step is to hide the “search” link for non-admin users. Open app/index.html and change the list item for the search link to something like the following, using the SDK’s if-user-in-group directive:

    <li if-user-in-group="admin"><a ui-sref="search">search</a></li>

    Now that you’ve learned how to control access with AngularJS and Stormpath, let’s look at simplifying the CORS configuration for Spring Boot.

    Easy CORS Configuration with Stormpath’s Spring Boot Starter

    Stormpath Java SDK v1.2.0 adds support for CORS entirely by configuration using the following properties.

    stormpath.web.cors.enabled
    stormpath.web.cors.allowed.originUris
    stormpath.web.cors.allowed.headers
    stormpath.web.cors.allowed.methods

    You can see their default settings in web.stormpath.properties.

    The only property you need to override is stormpath.web.cors.allowed.originUris, because it’s blank by default. Add this property to src/main/resources/application.properties and set it to http://localhost:3000. While you’re in there, remove stormpath.web.stormpathFilter.order=1.

    spring.data.rest.basePath=/api
    stormpath.web.me.expand.groups = true
    stormpath.web.cors.allowed.originUris = http://localhost:3000

    In addition, remove the corsFilter bean defined in src/main/java/com/example/SecurityConfiguration.java.

    Along with simplifying CORS configuration, this also solves an issue that the previous Spring-based configuration had, where clicking the Logout link displayed an error in the browser console.

    logout cors error

    Learn More about AngularJS and Stormpath

    This article showed you how to limit access to states and hide sections of your AngularJS UI with Stormpath’s AngularJS SDK. It also showed you how the 1.2.0 release of our Java SDK greatly simplifies CORS configuration.

    You can find the source code for this example application on GitHub. You can read the following commits to see how each section is implemented.

    1. Make Browsersync work with AngularJS’s HTML5 mode
    2. Limit access to states and hide links with Stormpath’s AngularJS SDK
    3. Easy CORS configuration for Spring Boot

    If you’d like to see more tips and tricks with Stormpath’s AngularJS or Java SDKs, please let me know in the comments below.


    The post Tips and Tricks for AngularJS and Spring Boot with Stormpath appeared first on Stormpath User Identity API.

    Tutorial: Build a Spring Boot Application with React and User Authentication


    Previously you created a CRUD application using Spring Boot, React and Stormpath where React handled the data view and the Stormpath Spring Boot Starter set up the login and registration pages. Now you’ll see how to use Stormpath’s React SDK to create login and signup pages manually so that every view on your site is managed by you.

    Re-wire Stormpath and React

    The Stormpath Spring Boot Starter sets up local server endpoints such as /login and /register that both serve up the front-end templates (for GET requests) and receive back-end API requests (for POST and PUT). You need to wire the GET endpoints to your own React-based HTML. You will use the React Router to handle each URL, and the pages themselves will call the starter’s POST and PUT endpoints when logging in, registering, and so on.

    Configure Spring Boot and Stormpath

    Start with a clone of the repository from a follow-on tutorial (which added webpack functionality):

    git clone https://github.com/stormpath/stormpath-spring-boot-webpack-example

    Strip out the security configuration by removing any Stormpath-related lines from application.properties. The file should then read:

    spring.data.rest.basePath=/api

    Now rename the endpoints you want to override for the front-end so that your URLs don’t clash with the Spring Boot Starter.

    spring.data.rest.basePath=/api
    stormpath.web.login.uri=/signin
    stormpath.web.register.uri=/signup
    stormpath.web.forgotPassword.uri=/lostpass

    Note: see the Stormpath documentation for a full list of these settings.

    Remove the Stormpath code from configure() in Security.java.

    @Configuration
    public class Security extends WebSecurityConfigurerAdapter {
        @Override
        protected void configure(HttpSecurity http) throws Exception {
        }
    }

    Install the project libraries and run webpack.

    npm install
    webpack 
    mvn spring-boot:run

    If you navigate to localhost:8080 you should see a table with some rows. This is what the user should see after they’ve logged in.

    React Starter

    Add Security with Stormpath for Authentication

    Add the Stormpath-related lines back to application.properties.

    stormpath.application.href = ...
    stormpath.client.apiKey.id = ...
    stormpath.client.apiKey.secret = ...

    Note: For security reasons you should never store your Stormpath keys inside of project files. Rather, use environment variables or an apiKey.properties file.
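
    For example, on macOS or Linux you could export them before starting the app (these names map to the stormpath.* properties above; the values are placeholders):

    export STORMPATH_APPLICATION_HREF=<your_application_href>
    export STORMPATH_CLIENT_APIKEY_ID=<your_api_key_id>
    export STORMPATH_CLIENT_APIKEY_SECRET=<your_api_key_secret>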

    Add the Stormpath lines back to Security.java, but change it to allow anyone to access the homepage, public files, and the URLs for the user management pages.

    @Configuration
    public class Security extends WebSecurityConfigurerAdapter {
        @Override
        protected void configure(HttpSecurity http) throws Exception {
            http.apply(stormpath()).and()
            .authorizeRequests()
            .antMatchers("/","/public/**","/login","/register","/forgot").permitAll();
        }
    }

    Restart and you’ll see the table header and logout button but without any table rows. You’ve allowed anyone to view the main page but there isn’t any data coming through.

    Network Failure

    If you open the network console you’ll see what is happening—an authorization failure occurred on the /api/employees REST endpoint. You need to set up React to redirect to a login page when an authorization failure like this happens.

    Create Routes in React

    The Stormpath React SDK uses React Router to specify pages like login. Add these libraries to your project:

    npm install --save react-stormpath react-router

    Now add the following imports at the top of your app.js entry.

    import ReactStormpath, { Router, AuthenticatedRoute, LoginRoute } from 'react-stormpath';
    import { Route, browserHistory } from 'react-router';

    These define which React components represent the login page, etc., and which URLs require authentication. Next initialize the SDK with the following after your imports:

    ReactStormpath.init({
      endpoints: {
        login: '/signin',
        register: '/signup',
        forgotPassword: '/lostpass'
      }
    });

    The init accepts various settings to configure the SDK. Here the management endpoint URLs are changed to match those specified in application.properties. Change the render command at the bottom of app.js to the following:

    ReactDOM.render(
      <Router history={browserHistory}>
        <AuthenticatedRoute path='/' component={MainPage} />
        <LoginRoute path='/login' component={LoginPage} />
      </Router>,
      document.getElementById('root')
    );

    You’ve specified that the root element must render from Router (a Stormpath component) which uses React Router’s browserHistory plugin to manage URLs. Then AuthenticatedRoute ensures that any visits to the homepage (/) need to be authenticated and are rendered with the MainPage component (you’ll derive this from the previous interface). Finally the LoginRoute makes sure that any login requirements redirect to /login and are rendered with a component called LoginPage (which you still need to define).

    Create Pages in React

    Put each page into a separate file. Inside of src create a folder called pages.

    Pages Folder

    Pull the previous components out of app.js and export them from MainPage.js.

    import React from 'react';
    import $ from 'jquery';
    import toastr from 'toastr';
    
    var Employee = React.createClass({
    
      getInitialState: function() {
        return {display: true };
      },
      handleDelete() {
        var self = this;
        $.ajax({
            url: self.props.employee._links.self.href,
            type: 'DELETE',
            success: function(result) {
              self.setState({display: false});
            },
            error: function(xhr, ajaxOptions, thrownError) {
              toastr.error(xhr.responseJSON.message);
            }
        });
      },
      render: function() {
    
        if (this.state.display==false) return null;
        else return (
          
              {this.props.employee.name}
              {this.props.employee.age}
              {this.props.employee.years}
              
                
              
          
        );
      }
    });
    
    var EmployeeTable = React.createClass({
    
      render: function() {
    
        var rows = [];
        this.props.employees.forEach(function(employee) {
          rows.push(
            );
        });
    
        return (

          {rows}
          Name Age Years Delete
        );
      }
    });

    var MainPage = React.createClass({

      loadEmployeesFromServer: function() {
        var self = this;
        $.ajax({
          url: "http://localhost:8080/api/employees",
        }).then(function(data) {
          self.setState({ employees: data._embedded.employees });
        });
      },

      getInitialState: function() {
        return { employees: [] };
      },

      componentDidMount: function() {
        this.loadEmployeesFromServer();
      },

      render() {
        return (
        );
      }
    });

    export default MainPage;

    Note: the main page now includes the logout button. This simplifies index.html, our single page app holder, significantly.
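
    The logout button itself can be rendered with the SDK’s LogoutLink component. A minimal sketch (the Bootstrap class is an assumption):

    import React from 'react';
    import { LogoutLink } from 'react-stormpath';

    // Sketch: a logout button built with the SDK's LogoutLink component
    var LogoutButton = React.createClass({
      render() {
        return <LogoutLink className="btn btn-default">Logout</LogoutLink>;
      }
    });

    export default LogoutButton;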

    
    
        React + Spring
        
    
    
        

    Now create the login page in LoginPage.js.

    import React from 'react';
    import DocumentTitle from 'react-document-title';
    import { LoginForm } from 'react-stormpath';
    
    var LoginPage = React.createClass({
    
      render() {
        return (
          <DocumentTitle title="Login">
            <div>
              <h3>Login</h3>
              <LoginForm />
            </div>
          </DocumentTitle>
        );
      }
    });

    export default LoginPage;

    Here you’re using the React Document Title library which you need to install. It allows you to set the browser title whilst using React Router.

    npm install --save react-document-title

    Now import the pages into your app.js entry.

    import MainPage from './pages/MainPage';
    import LoginPage from './pages/LoginPage';

    app.js now looks like the following.

    import React from 'react';
    import ReactDOM from 'react-dom';
    
    import ReactStormpath, { Router, AuthenticatedRoute, LoginRoute } from 'react-stormpath';
    import { Route, browserHistory } from 'react-router';
    
    import 'bootstrap/dist/css/bootstrap.css';
    import 'toastr/build/toastr.css';
    
    import MainPage from './pages/MainPage';
    import LoginPage from './pages/LoginPage';
    
    ReactStormpath.init({
      endpoints: {
        login: '/signin',
        register: '/signup',
        forgotPassword: '/lostpass'
      }
    });
    
    ReactDOM.render(
      <Router history={browserHistory}>
        <AuthenticatedRoute path='/' component={MainPage} />
        <LoginRoute path='/login' component={LoginPage} />
      </Router>,
      document.getElementById('root')
    );

    Reroute Through Spring Boot

    React Router alters the URL to /login, /register, etc. on the client side, but all requests still come through Spring Boot (i.e. the server). You need to tell Spring Boot to send those requests to our SPA in case someone refreshes the browser window. Change HomeController.java to the following.

    package tutorial;
    
    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    
    @Controller
    public class HomeController {
    
        @RequestMapping(value = {"/","/login","/register","/forgot"})
        public String index() {
            return "index";
        }
    }

    Start Your Site

    After running Webpack and refreshing you should be sent to the login page you just created.

    Custom Login

    Once you put in correct details you’ll see the data table populated as before.

    Add Signup and Password Reset

    There are two other pages you should specify – registration, and the forgot password screen. The first you can add using an ordinary React Router Route.

    import RegisterPage from './pages/RegisterPage';
    ...
    <Route path='/register' component={RegisterPage} />

    For the page itself, use the one from the React SDK example project.

    import React from 'react';
    import DocumentTitle from 'react-document-title';
    
    import { RegistrationForm, LoginLink } from 'react-stormpath';
    
    var RegisterPage = React.createClass({
      render() {
        return (
          
            

    Registration


    Your account has been created. Login Now.

    Your account has been created. Please check your email for a verification link.

    Back to Login

    ); } }); export default RegisterPage;

    Add a link on the login page pointing to /register.


    Register

    Now, when you click the link you should see a signup page.

    Custom Signup

    Add Forgot Password

    Lastly, the forgot password page is configured in much the same way.

    import React from 'react';
    import DocumentTitle from 'react-document-title';
    
    import { ResetPasswordForm } from 'react-stormpath';
    
    var ForgotPage = React.createClass({
      render() {
        return (
          <DocumentTitle title="Forgot Password">
            <div>
              <h3>Forgot Password</h3>
              <ResetPasswordForm />
            </div>
          </DocumentTitle>
        );
      }
    });

    export default ForgotPage;

    import ForgotPage from './pages/ForgotPage';
    ...
    <Route path='/forgot' component={ForgotPage} />

    Learn More!

    Just like that you have created your own React user management components and tied them to the Spring Boot backend. You can clone the full source code on GitHub. To go further, check out the Stormpath React SDK API for more details, or check out this blog post on how to create custom forms with React.

    The post Tutorial: Build a Spring Boot Application with React and User Authentication appeared first on Stormpath User Identity API.


    Add Facebook Login to Your Existing React Application


    According to Facebook, tens of millions of people use their platform to log into applications all over the web. By integrating Facebook Login with your web application, not only do users get the social capabilities they’ve come to expect, but also the confidence and convenience of reusing their credentials securely.

    Users also store a lot of personal information on their social media profiles. Linking their profile with your application means that they don’t have to tediously re-enter all of their data and they have fine-grained control over what access applications have. It’s a win-win for both users and developers to have an easy and trustworthy authentication platform like Facebook Login. So now that you know why you should be using Facebook Login in your application, let’s look at how to use Stormpath to make the development even easier.

    Stormpath is a plug-and-play backend for your web applications. It handles user and data management so that you can focus on the user experience. Stormpath also offers built-in social integration with Facebook and Google.

    Facebook Login via Stormpath is super easy to use and to make this demonstration even easier, we’ll be using Stormpath’s React project to get up and running quickly. Here’s how it’ll work:

    Stormpath - Facebook - React

    We’ll be using Stormpath’s React SDK with Express as our web server. On top of that we’ll add Facebook’s SDK to log users into the application and provide account data. For those playing along at home, make sure you have the following software installed:

    Let’s get started!

    Create a Facebook Application

    Before you write any code, first make sure that you’ve registered a Facebook application. When users log in, they will be prompted to grant your application permission to get their basic account information. If you don’t already have one, here’s how to create a Facebook application:

    1. Go to https://developers.facebook.com/apps/
    2. Click “Add a new app” and enter your app info
    3. Go to the Settings > Basic page
    4. Click “Add Platform”, select “Website”, and enter the URL of your website as well as the URL of the development version of your app, which will be http://localhost:3000

    Facebook application settings

    When you’re done, just leave the settings page aside as we’ll need your app info in the steps to follow.

    Configure Your Stormpath Account

    Now it’s time to configure your Stormpath account to talk with the Express server and Facebook’s API. If you don’t have an account, you can get a free account at https://api.stormpath.com/register.


    From your Stormpath Admin Console, go to the Directories tab and create a new directory. A Stormpath directory is a way to group accounts so that you have more granular control over things like security policies. Name the new directory “Facebook Login” and choose “Facebook” from the list of types. Two more fields will be added where you can enter your Facebook app ID and secret from the previous section.

    The next thing you need to do is to create an API key so the client can authenticate itself with your Stormpath account. The Stormpath documentation includes a detailed walkthrough of creating an API key, but here’s the abridged version:

    1. From the Home tab of the Admin Console select Manage API Keys under the Developer Tools heading
    2. Click the Create API Key button to trigger a download of an apiKey-{API_KEY}.properties file
    3. (Optional) Move the file to a ~/.stormpath directory

    You’ll also need to get the API endpoint for your application. This can be found by going to the Applications tab of the Admin Console and expanding the details of the application. Note that “My Application” is automatically created for you, so feel free to use that for demonstration purposes. The value you’re looking for is labelled HREF in the application’s Details tab. Keep this handy for the next section.

    That takes care of all the necessary prep work. Now you’re ready to start building the application.

    Add Facebook to Your React+Stormpath Application

    Everything you need to get an application running with Stormpath, React, and Facebook is in the stormpath-express-react-example repository on GitHub. You can install it locally with a few simple commands:

    git clone git@github.com:stormpath/stormpath-express-react-example.git stormpath-react-fb
    cd stormpath-react-fb
    npm install

    These commands will clone the git repository into a folder named stormpath-react-fb, move into that folder, and install all of the dependencies.

    Configure the Application

    For your application to work, it needs to know how to phone home to the Stormpath backend. To do that, you’ll set up a small configuration file named stormpath-react-fb/stormpath.yml and enter the data you set aside. This is how the file should be organized:

    client:
      apiKey:
        id: [1]
        secret: [2]
    application:
      href: [3]

    Fill in the placeholders with your account information:

    1. apiKey.id from the Stormpath API key file
    2. apiKey.secret from the Stormpath API key file
    3. HREF from the application details

    The application is runnable now, but you still need to do the most important part: adding the Facebook Login button!

    Facebook Integration

    You’ll be substituting the app’s own login prompt with the Facebook Login page — start by editing the src/js/pages/LoginPage.js file.

    Because Stormpath has built-in Facebook Login integration, all you need to do is import the SocialLoginButton component from the react-stormpath module, superseding the existing LoginForm component.

    import { SocialLoginButton } from 'react-stormpath';

    You will need to do just two things to render the button: specify Facebook as the login provider, and add some text for the login button.

    <SocialLoginButton providerId="facebook">Sign in with Facebook</SocialLoginButton>

    Putting it all together, your file should look like this:

    import React from 'react';
    import DocumentTitle from 'react-document-title';
    
    import { SocialLoginButton } from 'react-stormpath';
    
    export default class LoginPage extends React.Component {
      render() {
        return (
          <DocumentTitle title="Login">
            <div>
              <h3>Login</h3>
              <SocialLoginButton providerId="facebook">Sign in with Facebook</SocialLoginButton>
            </div>
          </DocumentTitle>
        );
      }
    }

    Run the Application

    facebook-login-window

    With everything configured, you’re ready to run the app. Execute the npm start command to spin up an Express server at http://localhost:3000. You’ll recall from the Facebook app configuration step that this is the URL of the platform you whitelisted. When the server is up, navigate to that URL and click the Login link in the header. You won’t see much there but you’ll know it’s working if you see a “Sign in with Facebook” button, which when clicked will redirect you to the Facebook Login page.

    stormpath-app-greeting

    After entering your Facebook credentials, you’ll be redirected back to the example app’s home screen where you should be greeted by name!

    Summary

    With this simple application you’ve built, you’re actually able to do some really cool stuff. A user can log in to their Facebook account and your application will immediately be able to make use of basic user information like name and email address. From a user’s perspective, it’s so convenient to have one account that they can securely log in to and manage their personal information centrally. Stormpath makes it easy to hook into Facebook’s SDK with its built-in social integration.

    You also can read more about Facebook integration with Stormpath in the Facebook Login Guide. Or, dive deeper into Stormpath’s React support with these resources:

    If you have any comments or questions about logging into your React applications with Facebook and Stormpath, feel free to comment below, hit us up on Twitter at @goStormpath, or contact us directly at support@stormpath.com.

    The post Add Facebook Login to Your Existing React Application appeared first on Stormpath User Identity API.

    Tutorial: Establish Trust Between Microservices with JWT and Spring Boot


    If you’ve never heard of JWTs (JSON Web Tokens), well then you didn’t read my last post on CSRF Protection with JWTs. To briefly recap: JWTs can be used wherever you need a stand-in to represent a “user” of some kind (in quotes, because the user could be another microservice). And, they’re used where you want to carry additional information beyond the value of the token itself and have that information cryptographically verifiable as security against corruption or tampering. In this post, we’ll talk about using JWTs to establish trust between microservices.

    For more information on the background and structure of JWTs, here’s the IETF specification.

    The code that backs this post can be found on GitHub. The example application makes use of the JJWT library. This Java JWT library has over 1000 stars on GitHub. As you’ll see further on in this post, the library has a very readable and easy-to-use fluent interface, which contributes to its popularity.

    Let’s begin at the beginning.

    What are Microservices?

    If you ask 10 different people the question, you might get 10 different answers. One of my favorite treatments on the subject is from Martin Fowler. In short, it’s “componentization via services” (direct quote from the article). By that, we mean identifying discrete business capabilities and exposing them as standalone services that can be scaled independently.

    In the distant past of 3 – 5 years ago, we had monolithic service oriented architectures:

    monolithic-soa-2

    If, say, your AuthenticationService started to get bogged down, it would make your GroupService unresponsive as well. Sure, you could deploy a copy of your big beefy server and put a load balancer in front of them, but that’s a lot of horsepower to throw at one overloaded service.

    What if we bust up our big beefy server into a number of smaller ones, each responsible for a piece of the architecture, like this:

    microservices-2

    Now if the AuthenticationService bogs down you can scale it independently. Maybe you’ll need 10 small instances of the AuthenticationService and only 2 instances of all the other services. Even better: if I’ve already authenticated (in this example), I can go right to the GroupService.

    But, with all this awesomesauce, we’ve introduced a new problem: all of these independently running microservices need to communicate with each other and they need to do so in a secure manner.

    We can use JWTs to not only carry information between microservices, but by the very nature of JWTs we can cryptographically verify the signature, proving that they have not been tampered with.

    JWTs and Microservices in Action

    Let’s fire up some microservices and see communication between them in action. This example exposes an API to demonstrate communication between microservices.

    Execute the following to build the example:

    git clone https://github.com/stormpath/JavaRoadStorm2016
    cd JavaRoadStorm2016
    cd roadstorm-jwt-microservices-tutorial
    mvn clean install

    Now, let’s run two instances of the example. This will serve as our simulated microservices environment:

    java -jar target/*.jar --server.port=8080 &
    java -jar target/*.jar --server.port=8081 &

    Let’s see the “happy path” in action – we’ll exercise some endpoints to show that a microservice trusts itself. We can do this using a command-line http tool. My personal favorite is HTTPie:

    http localhost:8080/test-build
    
    
    HTTP/1.1 200
    ...
    
    
    {
        "jwt": "eyJraWQiOiI4YjA3NzhkOC01MGJiLTQ1ZjAtYmYyMC1lYzg0ZDU3NTQ4NWYiLCJhbGciOiJSUzI1NiJ9...",
        "status": "SUCCESS"
    }

    http localhost:8080/test-parse?jwt=eyJraWQiOiI4YjA3NzhkOC01MGJiLTQ1ZjAtYmYyMC1lYzg0ZDU3NTQ4NWYiLCJhbGciOiJSUzI1NiJ9...
    
    
    HTTP/1.1 200
    ...
    
    
    {
        "jwsClaims": {
            "body": {
                "exp": 4622470422,
                "hasMotorcycle": true,
                "iat": 1466796822,
                "iss": "Stormpath",
                "name": "Micah Silverman",
                "sub": "msilverman"
            },
            "header": {
                "alg": "RS256",
                "kid": "8b0778d8-50bb-45f0-bf20-ec84d575485f"
            },
            "signature": "..."
        },
        "status": "SUCCESS"
    }

    The first command spits out a JWT. The second command parses the JWT passed in. The build operation uses the microservice’s auto-generated private key to sign the JWT. And, the parse operation uses the matching public key to verify the signature.

    Now, let’s repeat the parse command, but this time, against our second microservice – the one running on port 8081:

    http localhost:8081/test-parse?jwt=eyJraWQiOiI4YjA3NzhkOC01MGJiLTQ1ZjAtYmYyMC1lYzg0ZDU3NTQ4NWYiLCJhbGciOiJSUzI1NiJ9...
    
    
    HTTP/1.1 400
    ...
    
    
    {
        "exceptionType": "io.jsonwebtoken.JwtException",
        "message": "No public key registered for kid: 8b0778d8-50bb-45f0-bf20-ec84d575485f. JWT claims: {iss=Stormpath, sub=msilverman, name=Micah Silverman, hasMotorcycle=true, iat=1466796822, exp=4622470422}",
        "status": "ERROR"
    }

    Here we see that our 8081 microservice can’t parse the JWT. Trust has not been established between the two microservices.

    Public Key Infrastructure and JWT

    Now’s a good time to take a step back and look at how these JWTs are built and parsed in this example.

    When the Spring Boot application is first started, the microservice creates a key pair for itself. That is, it creates a private key and a public key. Every JWT that’s created from the example API is signed using the microservice’s private key. The public key is then used to verify the signature. This uses the RSA crypto libraries provided by Java and supported by the JJWT library. Here’s a great article on the inner workings of RSA crypto.

    Here’s the code that creates the key pair when the microservice starts:

    public PublicCreds refreshMyCreds() {
        myKeyPair = RsaProvider.generateKeyPair(1024);
        kid = UUID.randomUUID().toString();
    
    
        PublicCreds publicCreds = getMyPublicCreds();
    
    
        // this microservice will trust itself
        addPublicCreds(publicCreds);
    
    
        return publicCreds;
    }

    Notice that in addition to the key pair, we are creating a unique key ID. This will become important later.

    Build and Sign a JWT

    The /test-build endpoint is defined in the SecretServiceController. Its job is to create a JWT with some hard-coded custom and registered claims. It then signs the JWT using the microservice’s private key. Let’s jump into the code:

    @RequestMapping("/test-build")
    public JWTResponse testBuild() {
        String jws = Jwts.builder()
            .setHeaderParam("kid", secretService.getMyPublicCreds().getKid())
            .setIssuer("Stormpath")
            .setSubject("msilverman")
            .claim("name", "Micah Silverman")
            .claim("hasMotorcycle", true)
            .setIssuedAt(Date.from(Instant.ofEpochSecond(1466796822L)))   // Fri Jun 24 2016 15:33:42 GMT-0400 (EDT)
            .setExpiration(Date.from(Instant.ofEpochSecond(4622470422L))) // Sat Jun 24 2116 15:33:42 GMT-0400 (EDT)
            .signWith(
                SignatureAlgorithm.RS256,
                secretService.getMyPrivateKey()
            )
            .compact();
        return new JWTResponse(jws);
    }

    The JJWT library uses a modern fluent interface along with the builder pattern and method chaining. Line 3 kicks us off with a static method call that returns a JWT Builder object to us. Each successive method call adds to our JWT configuration until finally the compact method is called, which returns the resultant signed JWT in its string form.

    On line 4, we set the public key id as a header param on the JWT (that will become important in just a bit).
    On line 11, we sign the JWT with this microservice’s private key.

    If you look at the resulting JWT in a handy tool like (shameless plug) jsonwebtoken.io, you’ll see:

    {
     "typ": "JWT",
     "alg": "RS256",
     "kid": "cb5beb41-440d-4d14-9c6b-66199029ce19"
    }

    and

    {
     "iss": "Stormpath",
     "sub": "msilverman",
     "name": "Micah Silverman",
     "hasMotorcycle": true,
     "iat": 1466796822,
     "exp": 4622470422
    }

    Parse and Verify a JWT

    Next, let’s take a look at the code that backs the /test-parse endpoint. There’s some real JJWT magic happening here:

    @RequestMapping("/test-parse")
    public JWTResponse testParse(@RequestParam String jwt) {
        Jws<Claims> jwsClaims = Jwts.parser()
            .setSigningKeyResolver(secretService.getSigningKeyResolver())
            .parseClaimsJws(jwt);
    
        return new JWTResponse(jwsClaims);
    }

    Notice that this entire method is basically 3 lines.

    Line 3 is a call to a static method to get us a JWT Parser Builder object.

    Line 5 actually parses the incoming JWT string. Per the JWT spec, if the JWT is a JWS (signed JWT), the parser must verify the signature.

    That’s where little old line 4 comes in. It looks so small and unassuming. But, it packs a wallop into one line of code.

    It may not seem obvious, but there’s a chicken-and-egg problem here. We need a key in order to parse the JWT. In order to lookup the key, we need to examine the information in the header. But, we don’t yet know if we can trust that this JWT hasn’t been tampered with. You see? Chicken and egg.

    JJWT addresses this by providing a SigningKeyResolver interface. This enables us to choose (resolve) a key in flight, as it were, while we are parsing. Let’s look at the code from the SecretService and see what’s going on.

    private SigningKeyResolver signingKeyResolver = new SigningKeyResolverAdapter() {
        @Override
        public Key resolveSigningKey(JwsHeader header, Claims claims) {
            String kid = header.getKeyId();
            if (!Strings.hasText(kid)) {
                throw new JwtException("Missing required 'kid' header param in JWT with claims: " + claims);
            }
            Key key = publicKeys.get(kid);
            if (key == null) {
                throw new JwtException("No public key registered for kid: " + kid + ". JWT claims: " + claims);
            }
            return key;
        }
    };
    
    public SigningKeyResolver getSigningKeyResolver() {
        return signingKeyResolver;
    }

    On line 1, we are using a SigningKeyResolverAdapter. This is the common pattern of using an adapter that has empty implementations for all the methods in the interface so that we can implement only those methods we care about. In this case, the one method we are overriding is the public Key resolveSigningKey(JwsHeader header, Claims claims) method. Notice that the method gets the JwsHeader and the Claims objects passed in. And, the method will return a Key, which could be null if it’s unable to be resolved. This allows us to safely examine the header and possibly the claims while we are in the middle of parsing the JWT.

    Now, I promised you I’d explain the use of the kid in the header, and we’ve arrived at that moment! In this “poor man’s” key manager, each registered public key is stored in a Map identified by the unique kid. This enables the microservice to establish trust with itself and other microservices by adding public keys to the collection. As we touched on before, when the microservice starts up and generates its keypair, it immediately registers its own public key so as to trust itself when parsing JWTs signed with its own private key.

    Line 4 attempts to get the kid header parameter. If there is no kid, an exception is thrown. Earlier, when we tried to have our second microservice parse a JWT from the first microservice, we made it past this hurdle as there was a kid in the header.

    Line 8 attempts to retrieve the public key that matches the private key used to sign the JWT based on the kid value it found. If it’s not able to find the public key, then an exception is thrown. It’s here that our attempt to parse the JWT failed earlier. In the next section, we’ll see how we can register the public key from one microservice with another, thereby establishing trust.

    Assuming a key was found in the collection, it is returned from the method which in turn allows the JWT Parser to verify the signature and complete parsing the JWT.

    Establish Trust Between Microservices

    Now that we’ve seen the mechanism by which JWTs are verified and parsed, let’s look at how we can establish trust between microservices.

    The SecretServiceController exposes two additional endpoints: /get-my-public-creds and /add-public-creds. The first endpoint outputs a base64-urlencoded version of the microservice’s public key. This is safe to do as this type of key is meant to be distributed publicly. You could tweet it and include it in your email signatures and that would be just fine.

    http localhost:8080/get-my-public-creds
    
    HTTP/1.1 200
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 03:06:36 GMT
    Transfer-Encoding: chunked
    
    {
        "b64UrlPublicKey": "MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQC-...",
        "kid": "06adff6d-1351-4e27-8ab5-7a9fc837ad34"
    }

    Note: the b64UrlPublicKey in the output is NOT a JWT. It is simply a text based version of the binary public key.

    The b64UrlPublicKey and kid can be sent to the other microservice, after which that microservice will be able to verify JWTs from the first microservice. This is what I mean by establishing trust. Before, the second microservice very literally could not verify the signature of a JWT from the first microservice, as it didn’t have its public key on record. Here’s what this looks like:

    http POST localhost:8081/add-public-creds \
    b64UrlPublicKey=MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQC-... \
    kid=06adff6d-1351-4e27-8ab5-7a9fc837ad34
    
    HTTP/1.1 200
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 03:12:21 GMT
    Transfer-Encoding: chunked
    
    {
        "b64UrlPublicKey": "MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQC-...",
        "kid": "06adff6d-1351-4e27-8ab5-7a9fc837ad34"
    }

    The method that backs the /add-public-creds endpoint responds with a 200 status and spits back the encoded public key and kid to indicate that the public key was successfully added to its internal collection.

    Now, let’s try to take the JWT from our first microservice and use the second microservice to parse it once again.

    http localhost:8081/test-parse?jwt=eyJraWQiOiIwNmFkZmY2ZC0xMzUxLTRlMjctOGFiNS03YTlmYzgzN2FkMzQiLCJhbGciOiJSUzI1NiJ9...
    
    HTTP/1.1 200
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 03:19:53 GMT
    Transfer-Encoding: chunked
    
    {
        "jwsClaims": {
            "body": {
                "exp": 4622470422,
                "hasMotorcycle": true,
                "iat": 1466796822,
                "iss": "Stormpath",
                "name": "Micah Silverman",
                "sub": "msilverman"
            },
            "header": {
                "alg": "RS256",
                "kid": "06adff6d-1351-4e27-8ab5-7a9fc837ad34"
            },
            "signature": "QzR95gK9ly3Cr6hB-5OK-YHDUL2WbP1geG2m5oGH0IfSH8Z-..."
        },
        "status": "SUCCESS"
    }

    Now that trust has been established (by way of registering the first microservice’s public key), the second microservice is able to properly parse the JWT from the first microservice.

    JWTs for Trust and Data

    So far, we’ve been looking at some test endpoints that build and parse JWTs. The application in this example exposes a couple of other endpoints that simulate more realistic microservices communication.

    The /account-request endpoint of one microservice takes some search parameters and generates a JWT that can be passed to another microservice. Here’s where the real value of JWTs comes into play. The JWT is used both as a token to prove identity and carries additional information encoded into it that the receiving microservice can use to perform some action.
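
    The controller code for that endpoint isn’t reproduced here, but building such a token is just another pass through the JJWT builder. A rough sketch, reusing the SecretService from earlier (the claim name and the one-minute expiration are assumptions based on the behavior described below):

    // Sketch: embed the search parameter in a short-lived, signed JWT
    String jws = Jwts.builder()
        .setHeaderParam("kid", secretService.getMyPublicCreds().getKid())
        .claim("userName", userName)
        .setIssuedAt(Date.from(Instant.now()))
        .setExpiration(Date.from(Instant.now().plusSeconds(60))) // expires in one minute
        .signWith(
            SignatureAlgorithm.RS256,
            secretService.getMyPrivateKey()
        )
        .compact();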

    The AccountService has some hardcoded dummy account information to simulate a service that might query against a database for accounts. It also expects that there’s a JWT set in the standard Authorization header using the Bearer scheme.

    Let’s take a look at this in action:

    http localhost:8080/account-request userName=anna
    
    HTTP/1.1 200
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 04:26:23 GMT
    Transfer-Encoding: chunked
    
    {
        "jwt": "eyJraWQiOiIwNmFkZmY2ZC0xMzUxLTRlMjctOGFiNS03YTlmYzgzN2FkMzQiLCJhbGciOiJSUzI1NiJ9...",
        "status": "SUCCESS"
    }

    The jwt value can now be passed over to our second microservice which is acting as our account service:

    http localhost:8081/restricted \
    Authorization:"Bearer eyJraWQiOiIwNmFkZmY2ZC0xMzUxLTRlMjctOGFiNS03YTlmYzgzN2FkMzQiLCJhbGciOiJSUzI1NiJ9..."
    
    HTTP/1.1 200
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 04:30:37 GMT
    Transfer-Encoding: chunked
    
    {
        "account": {
            "firstName": "Anna",
            "lastName": "Apple",
            "userName": "anna"
        },
        "message": "Found Account",
        "status": "SUCCESS"
    }

    This indicates that, first, the microservice was able to validate the incoming JWT and second, the microservice was able to use the information contained in the JWT to lookup an account.
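    Strung together, the /restricted handler is conceptually just a few lines: pull the compact JWT out of the Bearer header, run it through the kid-aware parser, and use a claim to look up the account. A rough sketch under those assumptions; the AccountResponse type, the userName claim name, and the accountService lookup are illustrative, not the project's exact code:

    @RequestMapping("/restricted")
    public AccountResponse restricted(
            @RequestHeader(name = "Authorization", required = false) String authHeader) {

        if (authHeader == null || !authHeader.startsWith("Bearer ")) {
            throw new UnauthorizedException("Missing or invalid Authorization header with Bearer type.");
        }

        // Strip the "Bearer " scheme to get the compact JWT
        String jwt = authHeader.substring("Bearer ".length());

        // Signature and exp are verified here; an invalid or expired JWT throws
        Jws<Claims> jws = parseJwt(jwt);

        // Use the data carried in the JWT to do the actual work
        String userName = jws.getBody().get("userName", String.class);
        return new AccountResponse("Found Account", accountService.findByUserName(userName));
    }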

    If we don’t include the Authorization header at all, the response looks like this:

    http localhost:8081/restricted
    
    HTTP/1.1 401
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 04:29:36 GMT
    Transfer-Encoding: chunked
    
    {
        "exceptionType": "com.stormpath.tutorial.exception.UnauthorizedException",
        "message": "Missing or invalid Authorization header with Bearer type.",
        "status": "ERROR"
    }

    And, by the way, the JWT expires in one minute. A short expiration is good practice for microservice-to-microservice communication. Here’s what happens if we try to do the same search with an expired JWT:

    http localhost:8081/restricted \
    Authorization:"Bearer eyJraWQiOiIwNmFkZmY2ZC0xMzUxLTRlMjctOGFiNS03YTlmYzgzN2FkMzQiLCJhbGciOiJSUzI1NiJ9..."
    
    HTTP/1.1 400
    Connection: close
    Content-Type: application/json;charset=UTF-8
    Date: Wed, 07 Dec 2016 04:29:21 GMT
    Transfer-Encoding: chunked
    
    {
        "exceptionType": "io.jsonwebtoken.ExpiredJwtException",
        "message": "JWT expired at 2016-12-06T23:27:23-0500. Current time: 2016-12-06T23:29:21-0500",
        "status": "ERROR"
    }

    Note: There is no additional code for you to write to enable this behavior. This is one of the great things about the JWT spec and the JJWT library: a compliant parser must fail when parsing a JWT that has an exp claim whose value is in the past.
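    On the issuing side, that one-minute window simply comes from setting the exp claim when the token is built. Here's a minimal sketch with JJWT, assuming the signing private key and kid generated earlier in the tutorial:

    String jwt = Jwts.builder()
        .setHeaderParam("kid", kid)                 // lets the receiver pick the right public key
        .setClaims(claims)                          // e.g. the userName search parameter
        .setExpiration(Date.from(Instant.now().plus(1, ChronoUnit.MINUTES)))
        .signWith(SignatureAlgorithm.RS256, privateKey)
        .compact();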

    Bonus: Ditch HTTP/1.x for More Scalable Microservices

    We’ve come a long way on our JWT and microservices communication journey! This last section is a bonus that’s more about modern microservice architecture than about JWT specifically.

    So far, all of our examples have been interactions over HTTP. The example project is a Spring Boot application that naturally and easily “speaks” HTTP. However, modern microservices demand a more performant architecture. Why is HTTP less performant? It comes down to the synchronous nature of HTTP/1.x: you make a request and wait around for a response. That’s fine for web browsers, but it does not scale well in a microservices architecture.

    A better architecture that does scale very well is one based on asynchronous messaging. One of the more popular messaging servers is Apache Kafka. While Kafka itself runs on the JVM, you’re not required to interact with it from Java. It’s a pub/sub (publish/subscribe) system: producer clients publish messages to Kafka, and consumer clients read messages from it. The produce and consume operations are completely independent of each other.

    The example project is set up such that one microservice can behave as a producer and a second microservice can behave as a consumer. This is accomplished first by running Kafka and second by passing the appropriate properties to the microservices at startup.

    An extensive dive into configuring Kafka is outside the scope of this post. However, all you need to get the sample code working with Kafka is to follow their quickstart. Once you’ve downloaded Kafka, you’ll start ZooKeeper and Kafka like so:

    ~/local/kafka_2.11-0.10.0.1/bin/zookeeper-server-start.sh ~/local/kafka_2.11-0.10.0.1/config/zookeeper.properties

    ~/local/kafka_2.11-0.10.0.1/bin/kafka-server-start.sh ~/local/kafka_2.11-0.10.0.1/config/server.properties

    Now, we can fire up our example as before. This time, however, we will enable the first microservice to work as a message producer and the second microservice to work as a message consumer.

    java -jar target/*.jar --server.port=8080 --kafka.enabled=true

    java -jar target/*.jar --server.port=8081 --kafka.enabled=true --kafka.consumer.enabled=true

    Note: You’ll need to establish trust between these microservices as before using the /get-my-public-creds and /add-public-creds endpoints.

    This time, we’ll perform our account lookup using the /msg-account-request endpoint. This will:

      1. Respond with a JWT over HTTP
      2. Produce a message with the same JWT
      3. Publish it to Kafka

    We should see our consumer microservice automatically read and parse the incoming JWT message and log some output. Let’s check it out:

    http http://localhost:8080/msg-account-request userName=anna

    This results in the same HTTP response as before. However, if we flip over to our second microservice, we should see this in the log output:

    2016-12-07 00:20:25.631  INFO 12373 --- : record offset: 0, record value: eyJraWQiOiIwMjQwYWEyMy0xMjZlLTQ3MDctOWZjYy0zODE2YzBhZGEyMmYiLCJhbGciOiJSUzI1NiJ9...
    2016-12-07 00:20:25.657  INFO 12373 --- : Account name extracted from JWT: Anna Apple

    It may not look like much, but behind the scenes, a JWT was created with our search terms and sent to Kafka as a message. Here’s the code that made that happen:

    @RestController
    public class MessagingMicroServiceController extends BaseController {
    
        @Autowired(required = false)
        SpringBootKafkaProducer springBootKafkaProducer;
    
        private static final Logger log = LoggerFactory.getLogger(MessagingMicroServiceController.class);
    
        @RequestMapping("/msg-account-request")
        public JWTResponse authBuilder(@RequestBody Map claims) throws ExecutionException, InterruptedException {
            String jwt = createJwt(claims);
    
            if (springBootKafkaProducer != null) {
                springBootKafkaProducer.send(jwt);
            } else {
                log.warn("Kafka is disabled.");
            }
    
            return new JWTResponse(jwt);
        }
    }
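    The SpringBootKafkaProducer itself is only autowired above. Here's a minimal sketch of what such a producer could look like using Spring Kafka's KafkaTemplate; the topic name and constructor wiring are assumptions rather than the project's exact code:

    public class SpringBootKafkaProducer {

        private static final String TOPIC = "jwt-test"; // hypothetical topic name

        private final KafkaTemplate<String, String> kafkaTemplate;

        public SpringBootKafkaProducer(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        public void send(String jwt) throws ExecutionException, InterruptedException {
            // Block until the broker acknowledges the record so send failures surface immediately,
            // which is why the controller method declares ExecutionException and InterruptedException.
            kafkaTemplate.send(TOPIC, jwt).get();
        }
    }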

    The main method of our Spring Boot application sets up the microservice as a Kafka consumer if the properties are set properly.

    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(JJWTMicroservicesTutorial.class, args);
    
        boolean shouldConsume = context
            .getEnvironment()
            .getProperty("kafka.consumer.enabled", Boolean.class, Boolean.FALSE);
    
        if (shouldConsume && context.containsBean("springBootKafkaConsumer")) {
            SpringBootKafkaConsumer springBootKafkaConsumer =
                context.getBean("springBootKafkaConsumer", SpringBootKafkaConsumer.class);
    
            springBootKafkaConsumer.consume();
        }
    }

    The consumer code has some more setup to it. You can see it in the JJWTMicroservicesTutorial.java file in the example project.
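    If you just want the gist without opening the project, here's roughly what such a consume loop can look like with the plain Kafka client, reusing the kid-aware parseJwt() shown earlier; the topic, group id, and logging format are assumptions for illustration:

    public void consume() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "jwt-consumer"); // hypothetical group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("jwt-test")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(1000);
                for (ConsumerRecord<String, String> record : records) {
                    log.info("record offset: {}, record value: {}", record.offset(), record.value());

                    // Same verification path as the HTTP endpoints: kid lookup, signature, exp
                    Jws<Claims> jws = parseJwt(record.value());
                    log.info("Account name extracted from JWT: {}", jws.getBody().get("name", String.class));
                }
            }
        }
    }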

    JWTs for Fun and Profit

    In this post, we’ve seen the value of JWTs in establishing trust between microservices. The primary benefits are:

    • Verifiability — You have a high degree of confidence that a JWT has not been tampered with when the signature can be verified.
    • Automatic time stamp checks — When you have certain claims set in your JWT, such as exp, a spec compliant parser must fail if the relevant time-test is not met.
    • Additional encoded information — Aside from registered claims, you can include custom claims in your JWT. Unlike dumb tokens, this allows for meaningful data to be passed within the JWT.

    While there are no panaceas in tech, JWTs go a long way toward solving two challenges at once: securing communication between microservices and passing data between them.

    Learn More About JWTs in Java

    Interested in what else you can do with JWTs in Java? We’ve got some awesome resources to satisfy your curiosity.

    And as always, if you have any questions hit me up in the comments or on Twitter @afitnerd.

    The post Tutorial: Establish Trust Between Microservices with JWT and Spring Boot appeared first on Stormpath User Identity API.

    Angular and Microservices at The Rich Web Experience 2016


    As a Developer Evangelist at Stormpath, I’m tasked with developing our integrations, as well as showing developers how to use them. I do this through blog posts and speaking at conferences/meetups. It’s been a great ride so far and I’ve really enjoyed creating our JHipster integration and our initial Angular 2 support. I’ve been speaking at conferences since 2004, so it’s pretty cool that it’s now part of my job.

    Last week, I had the pleasure of speaking at The Rich Web Experience 2016. I was very grateful to spend the week in sunny Clearwater, Florida. Especially since it dipped below freezing in my home city of Denver, Colorado. The conference was held at the Opal Sands Resort, which just opened in February 2016.

    Testing Angular 2 Applications

    I had two talks at the conference, the first on Testing Angular 2 Applications. After talking about the importance of automated tests, I dove into Angular CLI, TypeScript, Karma, Protractor, and Jasmine.

    In the demo, I showed how you can add the “x” prefix to describe and it blocks in your tests to disable them — similar to how you can use @Ignore when testing Java with JUnit. You can also prefix these blocks with “f” if you want to run a single test. The demo covered unit testing with mocks, end-to-end testing with Protractor, and continuous integration with Travis CI and Jenkins. I also showed how to do continuous deployment to Heroku with both of these platforms.

    Near the end, I mentioned how we used the generator-angular-library to help create our Angular 2 SDK. This Yeoman plugin is nice in that it provides good testing support out-of-the-box and encourages you to increase your test coverage by displaying a coverage summary when you run npm test. You can see our current test coverage in the summary below.

    =============================== Coverage summary ===============================
    Statements   : 75.18% ( 312/415 )
    Branches     : 30% ( 12/40 )
    Functions    : 54.31% ( 63/116 )
    Lines        : 74.8% ( 282/377 )
    ================================================================================

    You can find the slides I used on SlideShare or view them below.

    All of the code I showed in my demo is contained in an ng2-demo repository on GitHub. This repo also contains a tutorial in its README.adoc that shows how to perform all the steps.

    Microservices for the Masses with Spring Boot, JWT, and JHipster

    The second talk I gave was about microservices: their history and how to develop them with Spring Boot and JHipster. I also talked about our experience at Stormpath migrating to them. It was fun to tell the story about how we migrated our backend to Spring Boot in 3 weeks.

    For this talk, I created the presentation using Asciidoctor and its Bespoke converter. I published the presentation and all its demos on GitHub.

    The demos were as follows:

    1. Spring Boot with JPA, H2, and a REST API, integrated with Stormpath and deployed to Cloud Foundry. Instructions to reproduce this demo are on GitHub.
    2. A simple blog application created with JHipster and deployed to Heroku. You can read the instructions or check out the JHipster Blog Demo I recorded earlier this year.
    3. Microservices with JHipster: converting the blog app to a gateway, generating a “store” microservice, and running it all with Docker and Kubernetes. Instructions to reproduce this demo are on GitHub.

    I exported a PDF of this presentation and published it on SlideShare. You can also view it below.

    In addition to the PDF, you can view the HTML version of the slides in your browser. The nice thing about the HTML version is you can use this URL and see all my speaker notes.

    I used Asciidoctor and published all my demos for this talk on GitHub so others can fork the project and make it better! I also plan to use it as a basis for the next section of the JHipster Mini-Book on InfoQ. The 2nd edition of the book was just published last week, so make sure to download a free copy while it’s still fresh.

    Future Talks

    Next year, I’ll be doing these talks as Stormpath webinars. I’ll also be speaking about Angular 2 at Jfokus, JHipster at DevNexus, and JHipster + Asciidoctor at the first-ever Devoxx US.

    If you have any questions about the talks mentioned here, please leave a comment, hit me up at @mraible on Twitter, or enter an issue in their respective GitHub repositories.

    The post Angular and Microservices at The Rich Web Experience 2016 appeared first on Stormpath User Identity API.

    What the Galactic Empire Could Learn From OWASP


    Security is crucial for any project, whether you’re building a hobby application on the terrestrial internet or a fully operational battlestation in a galaxy far, far away. That said, security isn’t easy. Every few years, the OWASP group publishes the Top Ten list, which reviews the most common security mistakes in applications across the internet. The same few vulnerabilities have been at the top of the list for years: SQL injection, broken session management, and cross-site scripting (XSS).

    The details of these attacks have been well known for over a decade, but they still top the list. Even when we know better, it’s easy to keep making the same mistakes over and over again.

    These mistakes can have profound implications. Verizon’s multi-billion dollar purchase might fall apart because of Yahoo’s knack for setting records with really big data breaches. A Russian hacker claims to have breached the U.S. Election Assistance Commission because of an unpatched SQL injection flaw.

    I find your security vague and unconvincing

    In the Galaxy Far Far Away, these same types of security mistakes led directly to the data leak that doomed the Death Star. (Be warned: spoilers for Rogue One ahead.)

    • Strong authentication and session management
      In a recommendation straight from OWASP, the stolen freighter should have never been allowed through the shield gate on Scarif with expired credentials. Whether it’s access tokens or callsigns, the ability to enter a highly secure system should be properly expired. And, when the client (or ship) presents authentication tokens through an untrusted connection, the tokens should be validated to make sure they haven’t been forged or tampered with.
    • Multifactor authentication
      At any point during or after the initial intrusion, requiring multiple types of authentication would have prevented the data breach. This is multifactor authentication in a nutshell: sometimes it’s possible to steal a password (or a freighter), but stealing a password and a second factor is much more difficult.

    If these basic security principles had been followed, it would have been impossible for the rebel scum crew of Rogue One to leak the critical information that led to the outcome of the Battle of Yavin. (Whether that’s good or bad depends entirely on your point of view, of course.)

    How to do security right

    If you’re building a battlestation, make sure you hire competent security professionals, and don’t make the same mistake three times in a row.

    If you’re building something a little closer to home, you’ll need to securely handle authentication and identity management. If you don’t want the risk of building it yourself, we can be a useful ally.

    Stormpath provides best-in-class security for concerns like authentication, authorization, single sign-on, and social login for web and mobile apps. Check out one of our quickstarts for your favorite web framework!

    And don’t forget to review the OWASP Top Ten list, no matter how you’re building your application. Nothing less than the future of the galaxy could be at stake.

    The post What the Galactic Empire Could Learn From OWASP appeared first on Stormpath User Identity API.

    2016 Year in Review — New Tools Broaden Authentication & User Management Support


    2016 was an awesome year for the team here at Stormpath! We hit the ground running in January and rapidly scaled both our internal staff and resources, as well as our service offerings around authentication and user management. None of this could have happened without the invaluable support, input, and feedback from our customers and partners. Here’s what we accomplished together:

    • Account Linking: This feature allows users to log into the same account via multiple methods without any extra development.
    • API Call Tracker: See how many API calls your applications are making to Stormpath at a glance in the Admin Console!
    • Custom Data Search: A customized search function for accounts based on the attributes you define.
    • Custom SMTP: Configure Stormpath to use an SMTP server of your choosing for added control over email workflows.
    • EU Enterprise Cloud: European-based applications can now keep their user data entirely within the EU, thanks to our EU region in Frankfurt, Germany.
    • JWT Inspector: The inspector is a developer tool that helps you debug JWTs directly in your browser, so you never have to leave the site you’re working on again.
    • Multi-Factor Authentication: MFA adds additional security to your application to prevent unauthorized account access.
    • Organizations: This new resource provides even easier support for multi-tenancy.
    • SAML: Stormpath-backed apps can act as SAML service providers that work with SAML services like OneLogin, Okta, Salesforce or any other SAML IdP, including home-grown and open source options.

    SDKs, Integrations, and Framework Support

    Java

    JavaScript

    Mobile

    .NET

    PHP

    Python

    • Improvements to Flask-Stormpath (And stay tuned for a huge release in 2017!)

    We can’t wait to see what the next year brings!

    The post 2016 Year in Review — New Tools Broaden Authentication & User Management Support appeared first on Stormpath User Identity API.
