Vgo Software

Monday
Sep 30, 2013

Deploying Tomcat on Amazon's EC2 Cloud Service

Now that Oracle's released its JaaS offering, it's time I started posting about the different cloud options available to Java developers.  Seeing as Amazon's services have been around as long as anybody's and are very robust at this point, they seem like a good starting point.

This article is a quick guide to deploying Tomcat on Amazon's cloud.  It's fairly quick and very cheap, even free, to follow along and try this on your own if you are interested.  Simply sign up for an account here and don't go over the hours limit; with 750 free hours a month, you should be able to complete this tutorial with plenty of room to spare.  If you do use an Ubuntu Server image as shown in this tutorial, it will cost you something.  It cost me about 26 cents to run through this, take my screenshots, and everything else for this post.  Just make sure you stop your instance after you are done with it.

The first thing you need to do after activating your account is create an EC2 instance.  For the purposes of this tutorial we are going to create a VM running Ubuntu Server, but any Linux environment will follow much the same procedure.  The Amazon Linux instances should be available for free.

Amazon Elastic Compute Cloud (EC2) instances can be created with or without a Virtual Private Cloud.  For a production instance you would want to use a VPC for security; for this exercise we are not going to use one.

1. Create a Security Group

Once you log in to the AWS Management Console, the first thing we will want to do is create a Security Group.  From the AWS Management Console, click on the EC2 link, which will take you to the EC2 Dashboard.  From there, click on Security Groups in the left navigation pane and then click the Create Security Group button.

In the dialog that pops up, provide a Name, a Description, and choose No VPC.  Then click on "Yes, Create".

The security group will be created without any rules.  To create the rules you want to add to it, select it from the table of groups you are provided with.  The dialog to add rules will appear in the bottom half of the page.

In our case we are only going to open port 80.  To do that, choose HTTP under "Create a new rule" in the Inbound tab, provide 0.0.0.0/0 as the source (to allow port 80 access from any IP address), and click "Add Rule".

Then click "Apply Rule Changes".
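If you would rather script this step, the AWS command line tools offer a roughly equivalent pair of commands.  This is a sketch, assuming you have the AWS CLI installed and configured, and that you name the group web-security-group as we do later in this post:

aws ec2 create-security-group --group-name web-security-group --description "Web security group"

aws ec2 authorize-security-group-ingress --group-name web-security-group --protocol tcp --port 80 --cidr 0.0.0.0/0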


2. Create an Instance

Now that we have a security group, we can create our instance.  To do that, click the Instances link in the left navigation pane of the EC2 Dashboard.  This will bring up a dialog offering the Quick Launch Wizard, which is what we will use for this exercise.

We will need a Key Pair in order to connect to the instance we create.  If you do not see a Key Pair listed under "Select Existing", choose "Create New", create one, and download it.  Click the Quick Launch Wizard radio button, choose "Ubuntu Server 13.04", and provide a name.  Then click the "Continue" button.

On the next page of the dialog, click the "Edit Details" button and then the Security Settings radio button.  This will open a multi-select box where we can pick security groups, including the one we just created.  Highlight "quicklaunch-1" (for SSH) and "web-security-group" (assuming that's what you named it, for HTTP), then click "Save details".

After saving the details, click "Launch" to create the instance.  It will take a few minutes to initialize.  In the table of instances we should see our newly created instance.  When its state changes from pending to running, we can move on to the next step.
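Incidentally, the whole wizard flow can also be scripted.  A rough sketch with the AWS CLI looks like the following; the AMI ID here is a placeholder you would replace with the Ubuntu Server image's actual ID, and t1.micro was the free-tier instance type at the time:

aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t1.micro --key-name my-key-pair --security-groups quicklaunch-1 web-security-group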

3. Install Tomcat

Now that we have a running instance, let's log into it and deploy Tomcat.  Check the box to the left of the row in the table of instances that contains the instance we created.  Then click on "Actions" and choose "Connect" from the dropdown.

You will need to provide the private key file in order to connect; it should have been downloaded when you created the Key Pair.  Make sure you don't lose this file.  In the dialog that pops up, fill in the path to the private key and then click "Launch SSH Client".
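If you prefer your own terminal to the browser-based SSH client, you can connect directly with the key file instead (my-key.pem is a placeholder for whatever your key file is called; ubuntu is the default user on Ubuntu images):

chmod 400 my-key.pem

ssh -i my-key.pem ubuntu@your-instance-public-dns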

After connecting, we need to install Tomcat.  To do that, issue the following command from the ssh terminal.

sudo apt-get install tomcat7

If you are using an Amazon Linux instance, use yum instead (sudo yum install tomcat7).  If you install with yum, you will also have to add the webapps package to see anything (sudo yum install tomcat7-webapps).

Tomcat should install fairly quickly, just take all the defaults for any inputs it requires.

Next, start Tomcat.

sudo /etc/init.d/tomcat7 start

Since Tomcat runs on port 8080 and we would need to run Tomcat as root in order to bind it to port 80, we will redirect port 80 to port 8080 instead by issuing the following command.

sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080
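Note that this rule is not persistent; rebooting the instance will clear it and you will have to issue the command again.  You can confirm the redirect is in place with:

sudo iptables -t nat -L PREROUTING -n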

We should now be able to access Tomcat using the public DNS address of our instance.  This is the same server you ssh'd into; you can find the public DNS name at the bottom of the details pane for your instance on the EC2 Dashboard page.

4. Install the Tomcat Management Webapp

If you want to install the Tomcat Management Webapp, issue the following command.

sudo apt-get install tomcat7-admin

On Amazon Linux, use yum again (sudo yum install tomcat7-admin-webapps).

You will also have to modify the tomcat-users.xml file in the conf directory.

cd /usr/share/tomcat7/conf

sudo vi tomcat-users.xml

Add a role for manager-gui and a user that has the role assigned.  Please note that all roles and users are commented out in the original file.  A user name other than "tomcat" would probably be safer, but for me, this instance is going away as soon as I am done with the tutorial.
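As a minimal sketch, the uncommented section might end up looking something like this (the user name and password here are placeholders; pick your own):

<tomcat-users>
  <role rolename="manager-gui"/>
  <user username="tomcat" password="changeme" roles="manager-gui"/>
</tomcat-users>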


After you have saved those changes, restart Tomcat.

sudo /etc/init.d/tomcat7 restart
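As a quick sanity check from the ssh session, a request like the following (substituting the user name and password you put in tomcat-users.xml) should return the manager page's HTML rather than a 401:

curl -u tomcat:changeme http://localhost:8080/manager/html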

And that's it, you've successfully deployed Tomcat in the cloud!

Now don't forget to stop that instance from running when you are done with it.


Wednesday
Sep 25, 2013

Oracle's Java as a Service

I've been in San Francisco at Oracle Open World 2013 for the past few days.  It's been a great conference so far with some interesting sessions, some America's Cup racing, and lots of exhibitors to see.

One of the most interesting items announced by Oracle at Open World this year was their new cloud offerings.  They've been talking a lot about the cloud for the past couple of years, but it seems this year they are finally releasing some interesting cloud-based products.

Not only do they now have an Infrastructure as a Service offering, much like Amazon's EC2 or Microsoft Azure, but they also have a Database as a Service offering and a Java as a Service offering.

This being a Java-focused blog, the Java as a Service product is what I found most interesting, and I was actually able to attend a hands-on lab that demonstrated the service.  The usage is very simple: it basically exposes a managed Weblogic Server for you to deploy to.  If you want fine-grained control over the environment, this isn't for you, as the only control you have is what you can accomplish via the Weblogic Management Console and the interface to the JaaS itself.

As far as quickly getting an application up there and running, however, it couldn't be simpler.  The lab demonstrated three different ways to deploy: with an IDE plugin (we used Eclipse; I believe there are others), via the command line, and via Maven.  You can also deploy and undeploy using the Weblogic console itself.

Most likely you will need more than a JaaS instance to do anything worthwhile; for instance, you will probably want a database for your application, and you will need to use their DBaaS for that.  But it does let you easily deploy and run Java applications (including ADF applications!) in the cloud.

Check it out here!

Wednesday
Sep 4, 2013

And now Microsoft is in the game...

A long while back in 2011, I wrote a post about Google's purchase of Motorola.  In the past couple of days came the news that, two years later, Microsoft has purchased Nokia's devices and services business.  If Google was late to the party back in 2011, Microsoft is going to have a hell of a time playing catch-up in 2013.

Their much-hyped Surface tablets did not quite strike a chord with customers, and Microsoft is in a much different position than Google was back then.  In 2011 Google had a modestly successful Android operating system and was suffering from a fragmentation problem (still a bit of a problem, by the way).  Their acquisition of Motorola should lead to Google doing its own manufacturing of phones that they can ensure run the latest and greatest version of their Android OS with all the best bells and whistles.  That may be just starting to happen now.  Although this won't solve the fragmentation problem, it will allow them to showcase Android.  Lately they've used deals with LG and ASUS to accomplish the same.

I don't think the problem with the Surface can have the same solution - purchasing a hardware manufacturer.  Unless this purchase allows them to create a better product at a less expensive price point, they are still going to have problems getting customers to buy into it.  If anything, the Surface proved that Microsoft is too late to attract customers with an overpriced, inferior product even if it ties in well with their desktop OS.

I, for one, am really interested to see whether this acquisition can give Microsoft a boost in the mobile department, but I really think the only way they are going to do that is by having a product that truly wows its potential customers, enough so that they will switch (probably once again) to a whole different platform.

Tuesday
Sep 3, 2013

System Modernization and System Design

Modernization efforts come in varying levels of complexity. We recently undertook a modernization initiative for a client who wanted to migrate their existing database portfolio from Sybase to Oracle. While Oracle's SQL Developer helps with migrating the database and its various objects, the applications connecting to these databases have to be remediated manually.  It is during such times that we see the importance and value of time invested in good system design.

A lot of the applications we encountered for remediation followed a consistent design. At a high level, one could view that design as an application made up of a stack of three horizontal layers: UI, Business Domain, and Data Access. Most applications also had a testing layer, which could be seen as a vertical layer that tested one or more of the horizontal layers. For the most part, the unit tests (part of the test layer) would exercise the classes of the data access layer, which was made up of Data Access Objects (DAOs) and Data Transfer Objects (DTOs), but because of the clean separation of the layers you could in theory test any layer. While I am not going into the details of what each layer does, it's not hard to guess what each one might be doing.

What I described in the last paragraph amounts to two very simple things: 1) structuring the application layers such that boundaries with interfaces are in place and enforced, and 2) consistently following a simple design/structure across multiple applications.  In my opinion, as simple as these things are, they matter a lot in the long term. The code written for one of these layers could be horrible, but if the boundaries are enforced with well-defined interfaces, a change effort of any kind can be undertaken fairly easily.  For example, a DAO layer implemented against a specific RDBMS product today can later be modified relatively easily to use an ORM-based, RDBMS-neutral approach, or a different RDBMS, which was our case. I am giving an example of the DAO layer, but this idea extends to any of the layers found in today's business systems.
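As a rough illustration, here is the kind of boundary I mean. All of the names are hypothetical and this is a sketch rather than our client's actual code, but it shows why the swap stays cheap:

// CustomerDao.java - the boundary. The UI and business layers depend only on
// this interface, never on a vendor-specific implementation.
public interface CustomerDao {
    Customer findById(long id);
}

// A trivial domain object (DTO) for the sketch.
class Customer {
    private final long id;
    private final String name;
    Customer(long id, String name) { this.id = id; this.name = name; }
    long getId() { return id; }
    String getName() { return name; }
}

// A vendor-specific DAO. Migrating from Sybase to Oracle (or to an ORM-based,
// RDBMS-neutral approach) means writing a new CustomerDao implementation;
// nothing above this layer has to change.
class SybaseCustomerDao implements CustomerDao {
    public Customer findById(long id) {
        // vendor-specific JDBC calls and SQL would live here
        return new Customer(id, "placeholder");
    }
}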

I think in this day and age, business systems should be architected and designed for change more than anything else.  And change is best accommodated when systems are simple to deconstruct. When I write code, I deliberately re-prioritize why I am writing it: first, for me or somebody else to read and understand, and then for the computer to execute. The computer couldn't care less how you wrote the code; if it were just for the computer, you might as well write in 0's and 1's. Note that this does not mean sacrificing time or space efficiency. In the same way, when making design choices I put them in the context of long-term change and maintainability.

More often than not, these simple steps result in software that is relatively easy to maintain in most non-edge-case scenarios. Code and design that is fluid and can adapt over time is one of the most important things, especially today, when requirements and business needs change so quickly. So the next time you work on a design or write code, deliberately change your thought process and see if that results in a better design. Worst case, you can be confident that you explored different options before finalizing something.

Tuesday
Jun 25, 2013

Problem with Web Services in ADF - Resolved

So I spent some time last week working with a client who was attempting some interesting things with ADF.  For one, this guy was a beginner to ADF and Java, and someone just threw him into the deep end.  It wasn't just "create a couple of pages to allow users to update this data"; it was create a web service, and not just a web service but a service that accessed more than one database.  One was an Oracle database, the other a SQL Server database.

Working with SQL Server itself wasn't a problem, but trying to get one application that could talk to both at the same time was.  It is supposed to be possible, but it isn't the best way to do things.  The better way is to use a DB Link on the Oracle database itself.

Once we straightened that out, we encountered an error testing a simple web service.  It would try to deploy to the integrated Weblogic instance but fail during deployment with an error that the service implementation class could not be found.

Now the only reason I could think of that it would attempt to deploy and give an error like that after compiling cleanly was that the Java versions were different.  If it was compiling with a newer version of Java than the one Weblogic was running on, that would be an issue.  I presumed, however, that the issue would also manifest itself when deploying an ADF Web Application, especially since it was the same app just being deployed differently.

Turns out presuming anything is a bad idea.  As it was late on Friday and I was getting close to missing my plane home, I asked the developer to send me the log, and while getting a ride to the airport I took a look at it on my phone.  I found this part of a line: "C:\Program Files\Java\jdk1.7.0_21\jre\bin\java.exe -client -jar", indicating that it was indeed compiling with JDK 1.7.0_21 and trying to deploy to a Weblogic server running on 1.6.


I e-mailed the developer back and he changed the Compiler "JDK Version Compatibility" setting to 1.6.  Lo and behold - problem solved.  And just in time for the weekend too!
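For anyone hitting the same thing outside of an IDE, the equivalent fix is to tell javac to compile for the older platform, along these lines (the file name here is hypothetical):

javac -source 1.6 -target 1.6 MyServiceImpl.java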