Sunday, September 30, 2018

Spring Boot (1.5) OAuth2 Server in an Enterprise environment





The problem


If we want to run an array of microservices and support user interaction through delegated authorization, this implementation is one of the options to consider, or at least review. Before explaining how to build an OAuth2 implementation in Spring Boot that is stateless and integrated, so that many microservices can rely on it, we first have to understand the difference between OAuth2 and, for example, OIDC. OAuth 2.0 is the industry-standard protocol for authorization, while OpenID Connect 1.0 is a simple identity layer on top of the OAuth 2.0 protocol. It allows Clients to verify the identity of the End-User based on the authentication performed by an Authorization Server, as well as to obtain basic profile information about the End-User in an interoperable and REST-like manner. This does not mean that OAuth2 cannot be used for authentication; it is just that OIDC is better suited for that job.

We are going to focus on the OAuth2 implementation in this article. There are several good references you can read before venturing to create your own implementation (one, two, three, four, five, six). My reasons for doing this were several: no OAuth2 service was available, our architecture needed microservices, the structure of the client-server architecture was uncertain, and a few others. Since we already worked with Spring Boot, implementing this solution was the next best thing we could do to move our project further toward its goals.

There are two ways to handle tokens in OAuth2: plain tokens and JWT tokens. For our purposes we chose the plain token implementation. The difference is that a plain token must be verified by the OAuth2 service every time it is used, while a JWT can be stored in the resource and verified locally using the published public keys. Either way we get a working solution, but implementing plain tokens was the simpler and faster way to go.

Our requirements were:
  • provide a stateless authentication/authorization solution for the web client
  • keep the server footprint small and the solution scalable
  • be able to see the tokens/users we generated
  • be able to revoke tokens
  • provide an automated way to obtain tokens for integration testing
  • let microservices both authenticate themselves and accept user-authenticated tokens
  • be able to connect to LDAP or a database
  • be able to later support SSO from a third-party provider
There is always an opportunity to implement different solutions, but this one could potentially play well into the future banking architecture and plans.


Server configuration


To start off, we chose Spring Boot OAuth2 and created a Spring Boot application. There were several configurations we needed to implement to make our server perform authorization and authentication.

  • WebSecurityConfigurerAdapter (to define /login, /logout, Swagger and filters - we can also use @EnableOAuth2Client to additionally configure an SSO client)
  • GlobalAuthenticationConfigurerAdapter (to define the user details service, the BCrypt password encoder and an init method to distinguish between the LDAP and database source). This adapter was needed because several filters read users, depending on the flow invoked.
    • auth.ldapAuthentication() was the starting point for LDAP
    • auth.userDetailsService(...) was the starting point for the user details service and password encoding
  • LdapAuthoritiesPopulator bean for custom LDAP authorities (used a repository to load authorities based on the user's authentication)
  • AuthorizationServerConfigurerAdapter (to define the OAuth2 server infrastructure, including custom SQL queries, as we needed DB access for a stateless solution across servers). This included tables like oauth_access_token, oauth_refresh_token, oauth_code and oauth_client_details; which tables are used depends on the flow invoked. The work involved overriding TokenStore, ClientDetailsService, AuthorizationCodeServices, configure(AuthorizationServerEndpointsConfigurer endpoints), configure(AuthorizationServerSecurityConfigurer security), DefaultTokenServices and configure(ClientDetailsServiceConfigurer clients) - with @EnableAuthorizationServer. See the sketch after this list.
  • ResourceServerConfigurerAdapter (to define the adapter that will serve as the entry point and configuration for any custom APIs) - with @EnableResourceServer.
  • We also needed to expose an API for user verification that publishes the Principal object (microservices will use it to obtain user details).
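To make the wiring concrete, here is a minimal sketch of the authorization server adapter backed by the JDBC stores described above, together with the user-verification endpoint. The class names, constructor wiring and datasource setup are illustrative, not our exact code.

```java
import java.security.Principal;

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.authentication.AuthenticationManager;
import org.springframework.security.oauth2.config.annotation.configurers.ClientDetailsServiceConfigurer;
import org.springframework.security.oauth2.config.annotation.web.configuration.AuthorizationServerConfigurerAdapter;
import org.springframework.security.oauth2.config.annotation.web.configuration.EnableAuthorizationServer;
import org.springframework.security.oauth2.config.annotation.web.configurers.AuthorizationServerEndpointsConfigurer;
import org.springframework.security.oauth2.config.annotation.web.configurers.AuthorizationServerSecurityConfigurer;
import org.springframework.security.oauth2.provider.code.JdbcAuthorizationCodeServices;
import org.springframework.security.oauth2.provider.token.TokenStore;
import org.springframework.security.oauth2.provider.token.store.JdbcTokenStore;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@Configuration
@EnableAuthorizationServer
public class AuthServerConfig extends AuthorizationServerConfigurerAdapter {

    private final DataSource dataSource;                 // holds the oauth_* tables
    private final AuthenticationManager authenticationManager;

    public AuthServerConfig(DataSource dataSource, AuthenticationManager authenticationManager) {
        this.dataSource = dataSource;
        this.authenticationManager = authenticationManager;
    }

    @Bean
    public TokenStore tokenStore() {
        // oauth_access_token / oauth_refresh_token: DB-backed, so stateless across instances
        return new JdbcTokenStore(dataSource);
    }

    @Override
    public void configure(ClientDetailsServiceConfigurer clients) throws Exception {
        clients.jdbc(dataSource);                        // oauth_client_details
    }

    @Override
    public void configure(AuthorizationServerEndpointsConfigurer endpoints) {
        endpoints.tokenStore(tokenStore())
                 .authorizationCodeServices(new JdbcAuthorizationCodeServices(dataSource)) // oauth_code
                 .authenticationManager(authenticationManager); // needed by the password grant
    }

    @Override
    public void configure(AuthorizationServerSecurityConfigurer security) {
        // lets authenticated resource servers call /oauth/check_token
        security.checkTokenAccess("isAuthenticated()");
    }
}

// The user-verification API: resource servers call it to turn a token into user details.
@RestController
class UserController {

    @RequestMapping("/user")
    public Principal user(Principal user) {
        return user;
    }
}
```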
Note that adapter ordering is extremely important; you may lose a lot of time investigating why something is not working just because of it. The order (lowest to highest) should be Web (needed for authorization_code and implicit, and stateful because of the login page and authentication) -> OAuth2 (needed for all grants; stateless for the password, refresh_token and client_credentials grants) -> Resource (needed for the APIs).
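As one illustration, explicit @Order values are a way to make the chain deterministic. A sketch only; the class name and matched paths are hypothetical:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.core.annotation.Order;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;

// Pin the web chain to a low order so the login-related endpoints
// are matched before the OAuth2 and resource server chains.
@Configuration
@Order(1) // lower value = evaluated earlier in the filter chain
public class WebSecurityConfig extends WebSecurityConfigurerAdapter {

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.requestMatchers()
                .antMatchers("/login", "/logout", "/oauth/authorize") // only these paths hit this chain
            .and()
            .authorizeRequests()
                .anyRequest().authenticated()
            .and()
            .formLogin().permitAll();
    }
}
```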

We opted to use authorization_code without a secret plus a refresh token for user authentication, client_credentials for the server and the microservices that needed to authenticate themselves, and the password grant for the integration test cases (as this is the easiest way to obtain a token for a specific user). Our client_credentials clients were given a default client role and the rest a default user role; this way, every authenticated client/user has a default role to start with, and it is a sure way to distinguish a human user from a server API. The one problem we still needed to solve was propagation of tokens through the layers of services: it is not good practice to propagate the same token between different horizontal layers, because the identity of the service performing the call is lost.
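Returning to the integration tests: a small helper can fetch a user token through spring-security-oauth2's password-grant support. A minimal sketch, with the token URI, client id/secret and credentials as placeholders:

```java
import org.springframework.security.oauth2.client.OAuth2RestTemplate;
import org.springframework.security.oauth2.client.token.grant.password.ResourceOwnerPasswordResourceDetails;

public class TestTokenSupport {

    // Fetches a plain access token for a specific test user via the password grant.
    public static String userToken(String username, String password) {
        ResourceOwnerPasswordResourceDetails details = new ResourceOwnerPasswordResourceDetails();
        details.setAccessTokenUri("https://auth.example.com/oauth/token"); // placeholder
        details.setClientId("integration-tests");                         // placeholder
        details.setClientSecret("secret");                                // placeholder
        details.setUsername(username);
        details.setPassword(password);
        return new OAuth2RestTemplate(details).getAccessToken().getValue();
    }
}
```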


Client configuration


Before we start with the microservices, it is important to say that all services are defined as resources (using the @EnableResourceServer annotation). This means that we can, for one, identify a resource and enable its usage by the client in the OAuth2 configuration, and second, set up the verification URL for the token. For any microservice to identify itself, we have two configuration options: declaratively in application.yml, or programmatically (in case we need to obtain the token ourselves). The first option is useful on any service that needs to verify tokens: whenever we receive a token, our API sends a request to validate it and populates the user details in Spring's SecurityContext. This is configured under the security.oauth2.client and security.oauth2.resource entries, where we specify the given client id, secret, the verify-token URL, the resource id, the user details URL and a few other parameters. The second option is to obtain the token programmatically, for example in the Spring Integration layer, where a declarative approach might be difficult, nonexistent or dependent on extensive logic. In that case the approach is to create client code annotated with @EnableOAuth2Client; in our case this was done in the ClientHttpRequestFactory. The token is obtained via OAuthClient and OAuthClientRequest calls to our authorization server using the client_credentials grant.
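For the declarative option, the Spring Boot 1.5 entries look roughly like this (all values are placeholders):

```yaml
security:
  oauth2:
    client:
      client-id: my-service
      client-secret: secret
    resource:
      id: my-resource
      token-info-uri: https://auth.example.com/oauth/check_token  # token verification endpoint
      user-info-uri: https://auth.example.com/user                # Principal endpoint
```

For the programmatic option, a sketch of fetching a client_credentials token with Apache Oltu's OAuthClient/OAuthClientRequest (endpoint and credentials are again placeholders):

```java
import org.apache.oltu.oauth2.client.OAuthClient;
import org.apache.oltu.oauth2.client.URLConnectionClient;
import org.apache.oltu.oauth2.client.request.OAuthClientRequest;
import org.apache.oltu.oauth2.client.response.OAuthJSONAccessTokenResponse;
import org.apache.oltu.oauth2.common.message.types.GrantType;

public class ServiceTokenClient {

    // Authenticates the microservice itself (no user context) and returns the token value.
    public static String clientCredentialsToken() throws Exception {
        OAuthClientRequest request = OAuthClientRequest
                .tokenLocation("https://auth.example.com/oauth/token")
                .setGrantType(GrantType.CLIENT_CREDENTIALS)
                .setClientId("my-service")
                .setClientSecret("secret")
                .buildBodyMessage();

        OAuthClient client = new OAuthClient(new URLConnectionClient());
        OAuthJSONAccessTokenResponse response =
                client.accessToken(request, OAuthJSONAccessTokenResponse.class);
        return response.getAccessToken();
    }
}
```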


In the end, the goal was achieved and all flows are now functional and serving their purpose. The next thing to worry about is switching the framework to OIDC coupled with OAuth2. That will, perhaps, be a good topic for one of the next blogs.

Sunday, April 15, 2018

MyBatis Paging, Sorting and Filtering




When considering which database persistence framework to use to complete the goal of development, we must take several factors into consideration:
  • Knowledge of your team and ability of the team leads to help out with problems
  • Available documentation
  • Maturity of the framework
  • Availability of the helper classes or supporting frameworks (e.g. how much custom functionality we need to create)
  • Underlying structure of the database and its complexity (if one exists)
  • Portability (if required)
  • "Top down" vs "bottom up" approach driven by either business requirement or existing technology
  • Whether we want or have to write SQL statements and how complex they need to be to fulfill the business goals
  • Do we "own" the data model or is this vendor maintained data model
As you can see, reaching a decision on what to use can be based on experience or on trial and error.

In one of my recent projects, we decided to use MyBatis as the framework that would let us fulfill most of the goals set by the business, the existing applications and the database topology. In an enterprise environment, you may face decisions that span not only your immediate application but several others and their data models. We needed to do just that: read data from various data sources, integrate with several different web services (both REST and SOAP) and provide a unified DB and API interface (facade). This interface needed to bring the web services and the various databases together to work as one, with improved performance. We needed to introduce an ESB layer, but also to create a uniform data model that we could model our facade objects on. MyBatis was a perfect tool for this. The only problem we faced was the lack of dynamic paging, sorting and filtering (PSF) functionality, given that we needed to combine results from different databases (some of which had awkward designs, to say the least). Hibernate was dead in the water here. We ended up using PL/SQL and SQL from various sources, with pipelines and extensive logic, to bring order into the data models. This solution worked very well, and in the end we only needed to implement and expose Search + PSF to the API and clients. To help us build dynamic additions to the query, we chose a supporting framework: squiggle-sql. In their own words:
Squiggle is a little Java library for dynamically generating SQL SELECT statements. It's sweet spot is for applications that need to build up complicated queries with criteria that changes at runtime. Ordinarily it can be quite painful to figure out how to build this string. Squiggle takes much of this pain away.
This worked perfectly, as we could expose a REST API in JSON with PSF parameters and reuse it through various interfaces. To avoid SQL injection (as we were building these queries by hand), we used enumerations so that only exact constructs would match, avoided risky operations like OR 1=1 and comments in filters, and limited the field types and lengths. Overall, we achieved a good mix of security and usability with a flexible interface.
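For a taste of the style, here is a sketch adapted from Squiggle's published examples; the table, columns and values are invented, and exact package and class names can vary between the original library and the squiggle-sql fork:

```java
import com.truemesh.squiggle.MatchCriteria;
import com.truemesh.squiggle.Order;
import com.truemesh.squiggle.SelectQuery;
import com.truemesh.squiggle.Table;

public class SquiggleSketch {

    public static void main(String[] args) {
        Table people = new Table("people");

        SelectQuery select = new SelectQuery();
        select.addColumn(people, "firstname");
        select.addColumn(people, "lastname");

        // criteria and ordering can be appended conditionally at runtime
        select.addCriteria(new MatchCriteria(people, "height", MatchCriteria.GREATER, 1.83));
        select.addOrder(people, "height", Order.DESCENDING);

        System.out.println(select); // prints the generated SELECT statement
    }
}
```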

We started by defining the interface in JSON, which had to support Paging (or streaming), Sorting and Filtering. Again, it is important to restrict any functionality with constants or enums so that every construct can be matched exactly to an operation or an underlying supporting bean. This is an important security feature.
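A sketch of what that request contract might look like as Java beans; every field, enum and name here is illustrative, not our production model:

```java
import java.util.List;

// Illustrative request contract for Search + PSF; every operation is pinned
// to an enum so constructs can be matched exactly server-side.
public class SearchRequest {

    public int page;                   // 0-based page index
    public int pageSize;               // capped server-side
    public List<SortClause> sort;      // applied in declaration order
    public List<FilterClause> filter;  // combined with AND only

    public enum SortField { NAME, CREATED }
    public enum SortDirection { ASC, DESC }
    public enum FilterField { NAME, STATUS }
    public enum FilterOperation { EQ, NEQ, GT, LT, LIKE }

    public static class SortClause {
        public SortField field;        // enum, never a free-form column name
        public SortDirection direction;
    }

    public static class FilterClause {
        public FilterField field;      // enum mapped to a whitelisted column
        public FilterOperation op;     // no free-form SQL fragments
        public String value;           // type- and length-checked before use
    }
}
```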

Executing paging in the database is relatively straightforward in Oracle by appending OFFSET <x> ROWS FETCH FIRST <y> ROWS ONLY. The other thing needed is the total number of rows returned from the database, so the GUI can calculate how many pages there are. This generally requires executing the statement a second time without the row-limiting clause, or we can write a WITH statement in Oracle, execute once, and link it into a broader query that counts on the first occurrence and limits rows on the second. An important constraint is that pages are numbered from 0 up, and you should not allow huge pages to be returned. If you require a single result containing all records, that can be served by a secondary API limited in use and in the users that can access it; the reason is that multiple parallel requests executed against a huge underlying data set can amount to a DoS-type attack.
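A hedged sketch of the single-pass variant with MyBatis annotations (table, columns and mapper names invented): the analytic COUNT(*) OVER () repeats the unlimited total on every returned row, so one round trip serves both the page and the pagination metadata.

```java
import java.util.List;

import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

public interface AccountMapper {

    // One execution: TOTAL_ROWS carries the unlimited count on every row,
    // while OFFSET/FETCH limits the page itself (Oracle 12c+ syntax).
    @Select("SELECT a.ID, a.NAME, COUNT(*) OVER () AS TOTAL_ROWS "
          + "FROM ACCOUNTS a "
          + "ORDER BY a.NAME "
          + "OFFSET #{offset} ROWS FETCH FIRST #{pageSize} ROWS ONLY")
    List<AccountRow> findPage(@Param("offset") int offset, @Param("pageSize") int pageSize);

    // Result row; assumes mapUnderscoreToCamelCase=true so TOTAL_ROWS maps to totalRows.
    class AccountRow {
        public long id;
        public String name;
        public long totalRows; // same value on every row of the page
    }
}
```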

Sorting can involve multiple columns, so we must match every sort column, with its ascending or descending direction, to a participating bean property. Sorting is added as an ORDER BY clause; the important point is to match the ORDER BY clause through an enum and the columns through the underlying bean. The bean exposed to the API should only carry fields that are necessary for the API to function properly and to satisfy a business need; any other database functionality regarding the tables should stay hidden behind services and transfer objects (ref). The ORDER BY clause can also be applied to complex queries, e.g. multiple sets joined together, by creating an encapsulating SELECT statement around the original request.
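A sketch of the enum-based whitelist (enum and column names invented): the API only ever sees enum constants, and the ORDER BY fragment is assembled from the mapped column names, never from raw input.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public final class SortSupport {

    // Whitelist: only these constants are accepted from the API.
    public enum SortField {
        NAME("ACC_NAME"),
        CREATED("CREATED_TS");

        final String column;
        SortField(String column) { this.column = column; }
    }

    // Builds e.g. "ORDER BY ACC_NAME ASC, CREATED_TS DESC" purely from
    // enum-mapped columns; insertion order of the map drives sort priority.
    public static String orderBy(LinkedHashMap<SortField, Boolean> ascendingByField) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<SortField, Boolean> e : ascendingByField.entrySet()) {
            sb.append(sb.length() == 0 ? "ORDER BY " : ", ")
              .append(e.getKey().column)
              .append(e.getValue() ? " ASC" : " DESC");
        }
        return sb.toString();
    }
}
```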

Filtering may be the trickiest to implement due to the wide range of criteria that can be applied. As before, using enums to define the operations and limiting the filter to single field names combined only with AND is important for both security and speed. Allowing OR or free-form statement entry would be a dangerous option to execute.
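And a sketch of an AND-only filter builder in the same pattern, reusing the column whitelist from the sorting sketch above (all names invented); values always go in as bind parameters, never concatenated into the SQL:

```java
import java.util.List;

public final class FilterSupport {

    // Only these operations are allowed; anything else is rejected at parse time.
    public enum Op {
        EQ("="), NEQ("<>"), GT(">"), LT("<"), LIKE("LIKE");

        final String sql;
        Op(String sql) { this.sql = sql; }
    }

    public static final class Clause {
        final SortSupport.SortField field; // reuse the whitelisted column enum
        final Op op;
        final Object value;

        public Clause(SortSupport.SortField field, Op op, Object value) {
            this.field = field;
            this.op = op;
            this.value = value;
        }
    }

    // Produces e.g. "WHERE ACC_NAME LIKE ? AND CREATED_TS > ?" and collects
    // the bind values in order; AND-only by design.
    public static String where(List<Clause> clauses, List<Object> bindValuesOut) {
        StringBuilder sb = new StringBuilder();
        for (Clause c : clauses) {
            sb.append(sb.length() == 0 ? "WHERE " : " AND ")
              .append(c.field.column).append(' ').append(c.op.sql).append(" ?");
            bindValuesOut.add(c.value);
        }
        return sb.toString();
    }
}
```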

Overall, what we achieved is that MyBatis now has the potential to match some of the things that Hibernate offers out-of-the-box.