Wednesday, August 24, 2016

Database encryption in transit, why is this still a question?

As data flows between systems, applications and databases, valuable information is passed along internal networks, external networks or a combination of both. That data can be intercepted by several different tools, including the monitoring tools companies use to verify application health and network availability. A great deal of security is focused on the network and on encrypting information being sent back and forth. Even with risk minimized and various network security controls in place, there are still opportunities to sniff or pull data in transit. It would seem that this security control would be a default in any secure configuration. However, there are still questions and open issues with database encryption and with encrypting data to and from the database.

This seems like the first place to start for database security. Data in transit and data at rest can be encrypted with basic implementation steps and become part of a standard database build. On several database platforms these types of encryption are transparent to applications and data users. The clients receive the data encrypted because the server has been set up with the proper configuration for encrypting data in transit.

The question is still, why is this not a standard? It could be because of other focus areas for the business or resource constraints. It can also be because this is something handled after the databases have been created, along with concerns about how applications might behave if the security settings of the database servers are changed. However, since this configuration is part of the server, even if not part of the initial install, it can be included in the configurations and the standard build of the databases. Database as a Service (DBaaS) can provide these types of configurations as part of the baseline installation of the database server. Using a service like this, or making these configurations part of every database deployment, will eliminate the question of encryption in transit.

Setting up the DBaaS with the needed encryption configuration takes care of the future standard: new builds, migrations, etc. But what about existing servers? A configuration change will need to be planned. Just as parameter changes and patches are made, these server-side configurations can be made and tested in non-production first. As much as I would normally like to make one change at a time, there are a few things I would group this change with, especially because maintenance windows are limited.

The database servers need to be configured to use SSL, which is a server-side setting for both Oracle and Microsoft SQL Server. At a high level, what needs to be done in each database environment is basically a parameter change and a restart of a service or listener.
For Oracle, parameters are set in the sqlnet.ora file.
For SQL Server, SQL Server Configuration Manager is used to set the Protocols under the Network Configuration for the server. In the properties for the protocols, Force Encryption should be set to Yes.
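As a sketch of the Oracle side, the sqlnet.ora parameters below turn on native network encryption, which is the parameter-only route (full SSL/TLS additionally requires a wallet and a TCPS listener endpoint). The algorithm values shown are examples; use the ones supported in your release:

```sql
# sqlnet.ora on the database server -- a minimal sketch, values are examples
SQLNET.ENCRYPTION_SERVER = REQUIRED
SQLNET.ENCRYPTION_TYPES_SERVER = (AES256)
SQLNET.CRYPTO_CHECKSUM_SERVER = REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA1)
```

With ENCRYPTION_SERVER set to REQUIRED, clients that cannot negotiate encryption are refused the connection, which matches the behavior of Force Encryption on the SQL Server side.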

Should this still really be a question? Encryption of data should just be a standard. Start by verifying that the data in transit is encrypted. Data at rest encryption would be the next step, followed by other data access controls and protection.

Tuesday, August 16, 2016

Database Olympics - Training for the Medals?

I will admit, I have been inspired by watching the Olympics: the work, training and even reinventing that the athletes do to be successful at the competitions. I enjoy sports and have participated where I could, but accepted the fact over the years that being short, and now getting older, has created some physical limitations. It is still fun, and if you didn't know this about me, swimming was my sport; it even helped me fund my schooling (as a coach). I was a flyer and a sprinter, not too keen on long distances, though a nice long slow swim is very relaxing to me now. I do get excited about the swimmers doing well, and I enjoy watching how the sport has changed over the years.
The photo for my blog here is even me standing next to Michael Phelps just a "few" years ago...
One might even say that I temporarily retired from writing blog posts (considering how old the last post was), but I now feel that this is part of my new training plan. I have been impressed by how Phelps has refocused and set goals in order to accomplish what he has this year. Katie Ledecky, even though she competes in the longer distances, has dominated her events; I am amazed by how hard she works and trains, and having seen earlier photos of her with Phelps, she might have used meeting him as encouragement to work harder.
But this isn't an article about Phelps, it is about what we can learn and be inspired to do as Olympians in our own field. Yes, I said that we are Olympians. There is even a TV commercial out there with Kayla Harrison (awesome Judo athlete) saying that things we do normally earn us medals. (If you don't know anything about Kayla, google her and "This is My Day").
So, what are we training for? Any upcoming events? Where do we have to do things differently? What conditions have changed? If we are looking at database environments, there is so much growth and potential here because of new technologies and business needs. Databases are being provided as services, and self-provisioning has definitely changed the landscape for DBAs; one might even say DBAs need to adjust because older skills are aging out and newer (younger) options are arriving. The knowledge is needed to provide data intelligence while still providing highly available, well-performing and secure data sources.
We might be preparing for the migration or upgrade (12cR2???) event coming soon. Our training would be learning the new features, testing our environments and getting prepared to succeed in that event. Just imagine celebrating those victories with medals or other types of awards.
Doing things differently, we might adopt new technologies or automate parts of our job in order to focus on different areas and work more with the business to provide data solutions.
Data security is a higher priority, and it is where I have been focused: access controls, data protection, and how to validate and monitor that the controls are in place and continue to be effective.
New processes, new technologies and working differently are what I have heard the athletes who come back year after year are doing at the Olympics, so why shouldn't we embrace that for what we do and how we work in our environments?
Validate the processes and controls, use services and automation to avoid constantly repeating tasks and be "faster" at delivery, and continue to learn and, if needed, refocus on areas that meet business requirements.
Let's be inspired to learn more, work more efficiently and celebrate what we do.

Thursday, October 10, 2013

Proof in the Privileges

Asking for full permissions on the database, tables or schemas may have been an easy way to do upgrades or run other database applications, but it does not provide a secure environment and it definitely doesn't get to the level of least privilege. Least privilege is a way to minimize access to sensitive data, and granting only the permissions needed is a key step for database security.
Knowing what permissions have been granted is easy to find out by querying the catalog tables and auditing changes in grants. With Privilege Analysis we can also verify what permissions are actually being used. This analysis can provide roles that have only the permissions used, or scripts to revoke the unused permissions. It is nice to have proof that only SELECT or EXECUTE on a few objects is required instead of the DBA role.
So how does it work? The DBMS_PRIVILEGE_CAPTURE package is used to capture which permissions are used and to generate the results. DBA_USED_PRIVS and DBA_UNUSED_PRIVS are some of the views that contain the information from the capture.

BEGIN
  DBMS_PRIVILEGE_CAPTURE.CREATE_CAPTURE(
    name        => 'dba_capture_all_privs',
    description => 'privilege_analysis_example_for_all_users',
    type        => DBMS_PRIVILEGE_CAPTURE.G_DATABASE);
END;
/


There are options that will capture different privileges for the database, roles and context: G_DATABASE, G_ROLE, G_CONTEXT and G_ROLE_AND_CONTEXT.
Creating scripts to create a role or revoke privileges can be done with a query against the DBA_USED_OBJPRIVS or DBA_UNUSED_OBJPRIVS views.
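Tying the steps together: a capture has to be enabled, exercised by the workload, disabled and then reported on before the views are populated. Here is a sketch using the capture name from the example above; verify the view columns in your release before relying on them:

```sql
-- Enable the capture, run the application workload, then stop and report
EXEC DBMS_PRIVILEGE_CAPTURE.ENABLE_CAPTURE('dba_capture_all_privs');
-- ... run the normal application workload here ...
EXEC DBMS_PRIVILEGE_CAPTURE.DISABLE_CAPTURE('dba_capture_all_privs');
EXEC DBMS_PRIVILEGE_CAPTURE.GENERATE_RESULT('dba_capture_all_privs');

-- Now the views show what was actually used
SELECT username, sys_priv, obj_priv, object_name
  FROM dba_used_privs
 WHERE capture = 'dba_capture_all_privs';
```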

It is great to have proof of what privileges are being used and being able to do the analysis to help secure the environment.

There are several layers of security that are important for the environment, and to get more information about Oracle Database security a few of us have written an ebook, Securing Oracle Database 12c: A Technical Primer, with references and 12c information. It is available free for a limited time by registering at and using code: db12c. It also has a few more examples of Privilege Analysis. Be sure to check it out, as there are some excellent examples on auditing, encryption, handling privileged users and many others.

Monday, August 5, 2013

Pluggable databases - logging in

User group technology days and meetings are not only great for the presentations but for the hallway conversations. In a short discussion in the hall, I realized I had forgotten to mention something in my presentation the other day about pluggable databases, an Oracle Database 12c new feature, and didn't even realize until having this conversation that it would be of interest to others. It is something that is fairly simple once you figure it out, but it can cause a few minutes of distress, or a couple of hours of doubting how one is even a DBA and surviving.
With pluggable databases there is quite a bit of discussion around creating them and moving a pluggable database from one system to another (unplug/plug). The database configuration assistant (dbca) is easily used to create a container or pluggable database. Normally, after creating the database in a Linux environment, the DBA goes into sqlplus from the command line, logs into the database and does some validations. You will probably find that logging in with sqlplus / as sysdba gets you connected to the container database. But how does one get to the pluggable database? And an even better question: is the pluggable database even available?
Each of the pluggable databases can be opened and closed individually. Shutting down the container database will shut down all of the pluggable databases, but starting up the container database doesn't mean all of the pluggable databases are started. To verify the pluggable database is open, log in to the container database through sqlplus and run the following:
SQLPLUS> select name,open_mode from v$pdbs;

NAME                           OPEN_MODE
------------------------------ ----------
PDB$SEED                       READ ONLY
PDBMM2                         MOUNTED

Notice that PDBMM2 is only in MOUNTED state and not open. To open the pluggable database, run one of the following options:
SQLPLUS> alter pluggable database ALL open;
SQLPLUS> alter pluggable database PDBMM2 open;
SQLPLUS> alter pluggable database ALL EXCEPT PDBMM1 open;
Now that the database is open, connecting to the pluggable database requires the same information we have always needed to connect to any database: service name, port and host. Each pluggable database is registered as its own service, so the connection is like logging into a normal database instance in previous releases, with the service name included in the login: username@PDBMM2.
The pluggable databases will be easily accessible through Enterprise Manager and other tools like SQL Developer; the setup is just like logging into a database instance using the service, and it doesn't need the name of the container database, just the name of the pluggable database. From the server, logging in through sqlplus might at first be confusing if you are in the container database and trying to get to the pluggable database. A connect username@pdbmm2 (through a TNS alias or EZConnect) will get you there. Note that a PDB is not a separate instance, so setting ORACLE_SID will not reach it; from a privileged session in the container you can instead switch containers with ALTER SESSION SET CONTAINER.
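A short sketch of the two routes into a PDB; the host, port and user names here are examples:

```sql
-- From a privileged session in the container database, switch containers:
ALTER SESSION SET CONTAINER = PDBMM2;

-- Or connect directly to the PDB service with EZConnect from the OS prompt:
-- $ sqlplus scott@//dbhost:1521/PDBMM2
```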

Monday, October 1, 2012


Say it real fast and it sounds like Twiki of Buck Rogers. (At least that was part of the discussion with some other Oracle ACE Directors.) The other part of the discussion was that this can provide a great way to consolidate, patch/upgrade and maintain Oracle databases.

So what do these new acronyms mean? CDB - container database: the container database is the global area for the database and contains the main system information. PDB - pluggable database: the pluggable database holds the user/application information, with the user tables and the system information about all of the objects in that pluggable database. This is a key new feature of the Oracle 12c database.

Just start to think about what this can mean. It means I can have a few container databases (CDB) and multiple pluggable databases (PDB) in each container. I can back up and recover a PDB to a point in time, I can clone a PDB in seconds, and I can plug a database into a patched CDB and have that PDB now on the patched version as well. Each PDB is isolated from the other PDBs, and there are now security options for access to a CDB and separate logins for PDBs to keep access separate. Databases from previous versions are now non-CDBs, and non-CDBs are also available in 12c; they behave like the current database instances, with schemas and shared system information. They are easy to manage in the database tools, like the database configuration assistant, Oracle Enterprise Manager and SQL Developer.
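As a sketch of the operations mentioned above (the names and paths are examples, and in 12cR1 the source PDB may need to be open read-only for the clone; check the syntax in your release):

```sql
-- Clone an existing PDB within the same CDB:
CREATE PLUGGABLE DATABASE pdb_clone FROM pdbmm2;

-- Unplug a PDB to a manifest file, then plug it into a patched CDB:
ALTER PLUGGABLE DATABASE pdbmm2 CLOSE;
ALTER PLUGGABLE DATABASE pdbmm2 UNPLUG INTO '/u01/app/pdbmm2.xml';
-- (in the patched CDB)
CREATE PLUGGABLE DATABASE pdbmm2 USING '/u01/app/pdbmm2.xml';
```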
The rest of the week at OOW should provide more information about CDBs and PDBs. This is a nice new feature of Oracle 12c and provides an easy way to manage different applications in one CDB. Faster too! Another bonus.

Oracle Technology

Why is technology fun? It is always changing, providing new solutions and new innovations. If you want to have a simple career, then technology is not for you. Especially for DBAs, we have new things happening all of the time. More data, big data, faster hardware!

Yes, OOW does speak loud and proud about the Oracle technology, the things they are doing well and how they have the best of breed in the technology stack. It also gives motivation to look at things differently and to provide value to the businesses.

As a DBA, some things get simpler, while there are other opportunities in our jobs to keep us challenged. Cloud offerings, Engineered Systems, and better performance from software and hardware are a few things that make life simpler. DBAs have the opportunity to manage these engineered systems, work with cloud offerings and database as a service, and even develop into the role of a Database Machine Administrator (DMA).

There are still the challenges of data: what data the business needs, integration of data, securing data for the business. Is this an emerging role for the DBA? Do we need Big Data DBAs? Which new features coming out offer benefits and should be implemented? Even if something hasn't worked in the past or wasn't seen as important, is it now?

That might be one interesting thought here: even though in previous years at OOW I didn't see cloud as important, being willing to come back, look at it with a new set of eyes and see the benefits that are there now is how we should be looking at our database environments. Take a new look, take advantage of new technology, and maybe revisit a direction that was rejected in the past but might be worth it now. The same goes for the role of the DBA: not just creating databases and adding users, but new tools and new opportunities.

OOW 2012 - Keynotes

Oracle Cloud and Engineered Systems were already mentioned last night, and today is going to be a good day for the database. More details on the latest version of the database should be provided.

Even though the keynotes are high level about Oracle products and the stack, they give a good picture of what is currently important to the Oracle executives and the direction the Oracle products are heading. Get the big picture first and then follow up with sessions to dive into more details.

The other great opportunity is to network with the user community and see who is looking forward to implementing new features and what products have been game changers in their company.