Thursday, December 29, 2016

Microservices as a Product

As the year wraps up, the use of microservice architecture continues to rise. However, people are often confused about the use and, more importantly, the rationale behind creating these services. They are not to be confused with APIs, and the point is not to version them in a silo-ed manner, but to ensure that all end consumers are accommodated by them.
In a recent Technology Radar, the term 'Microservice as a Product' was promoted, which prompted some introspection on my part. In my opinion the term is misleading: when you market a product, there are a bunch of microservices that are documented and provided, and over time these grow while parallel services provide alternatives, with the original ones remaining for backward compatibility.
It is true that established players are promoting microservices in the cloud and also provide a marketplace, but treating a microservice as a product is a bit far-fetched; terming it tooling (a software component) fits the job better. Otherwise, we are bound to be stuck with the naming conventions and keep getting confused (much as with an application and a plugin).

Tuesday, August 16, 2016

Migrating Hybris with Ease

Running Hybris on Oracle, and taking advantage of its data backup/recovery mechanism.

During the course of a typical Hybris implementation, changes made (intentionally or otherwise) to the schema or data can adversely affect the development environment and make it unstable. In some cases, initialization of the system is the only resort, which means losing all the manually created configurations in the system and re-running the ImpEx scripts (those not hooked up for automatic execution). This is time-consuming and requires manual intervention if the ImpEx scripts and external settings are not properly set up. Similar to Rails Active Record, where the database state is kept in the form of migrations, the data structure here is kept under ImpEx and items configuration and updated as necessary.

To alleviate this problem in our latest project, we use the same Oracle database that is present on the staging server and, just after re-initialization, reset the database back to its saved state.
Periodically, the database team refreshes the master DB backup to keep up with ongoing schema changes, which keeps the additional work a developer has to perform after 'initializing' the system to a minimum. So instead of running all the ImpEx scripts and doing synchronization, a typical system update suffices.

Here's a short description of how the database is reset from its backed up state.


1. Disconnect all running applications/programs that the Oracle user is logged into.

2. Log on to SQL*Plus as an admin user: sqlplus / as sysdba

3. Drop and create the user via a SQL file with the following content (the paths point to where the .dbf files reside inside the oradata folder).

CREATE TABLESPACE YHTSPACE DATAFILE 'G:\app\oracle\oradata\HYBRIS\YHSPACE.DBF' SIZE 20M AUTOEXTEND ON LOGGING;
CREATE TEMPORARY TABLESPACE YHTEMP TEMPFILE 'G:\app\oracle\oradata\HYBRIS\YHTEMP01.DBF' SIZE 5M AUTOEXTEND ON;
define us=aphy
define pw=aphy
define ts=YHTSPACE
define tts=YHTEMP
DROP USER &us cascade;
CREATE USER &us IDENTIFIED BY &pw DEFAULT TABLESPACE &ts TEMPORARY TABLESPACE &tts PROFILE DEFAULT ACCOUNT UNLOCK;
GRANT "CONNECT","RESOURCE","CTXAPP" TO &us;
GRANT UNLIMITED TABLESPACE TO &us;
ALTER USER &us DEFAULT ROLE ALL;
GRANT CREATE CLUSTER TO &us;
GRANT CREATE DATABASE LINK TO &us;
GRANT CREATE SESSION TO &us;
GRANT CREATE ANY INDEX TO &us;
GRANT CREATE SEQUENCE TO &us;
GRANT CREATE SYNONYM TO &us;
GRANT CREATE TABLE TO &us;
GRANT CREATE VIEW TO &us;
GRANT CREATE PROCEDURE TO &us;
GRANT CREATE TRIGGER TO &us;
GRANT CREATE TYPE TO &us;
GRANT CREATE SNAPSHOT TO &us;
GRANT EXECUTE ON CTX_DDL TO &us;
GRANT ANALYZE ANY TO &us;
GRANT IMP_FULL_DATABASE TO &us;


4. Run the import command:
imp aphy/aphy FROMUSER=APQA TOUSER=aphy file=G:\app\oracle\oradata\HQD_22072016.dmp
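
The steps above can be collected into a single script so the reset is one command. This is only a sketch: it assumes a Unix-like shell, that the DROP/CREATE statements from step 3 are saved as reset.sql, and that sqlplus and imp are on the PATH (on Windows, the same two commands would go in a .bat file with the G:\ path as shown above).

```shell
#!/bin/sh
# Sketch: reset the Hybris schema from the master dump.
# reset.sql is assumed to contain the statements from step 3.
set -e

# Step 3: drop and recreate the user as sysdba.
sqlplus -S / as sysdba @reset.sql

# Step 4: import the backed-up schema into the fresh user.
imp aphy/aphy FROMUSER=APQA TOUSER=aphy file=/path/to/HQD_22072016.dmp
```

With this in place, a developer only runs the script after re-initialization and then performs a regular system update.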


While this method of managing the database is a time saver, it inherits the problem found in most tailor-made approaches: it involves manual intervention. Whether this is a worthwhile approach or a poor one is entirely up to you, and it can be adapted to your needs.

Thursday, May 19, 2016

Revving up Raspberry PI


Over the past weekend, I embarked on setting up a Raspberry Pi for my computing needs. While setting up an Angular-based e-commerce store that uses SAP Hybris YaaS (which is finally maturing), I encountered a couple of interesting things that I thought I'd share.

My setup was a Raspberry Pi 3 (which has onboard WiFi and Bluetooth) and a LAN cable connecting it to a laptop. As this was close to a bare-bones setup, I initially ran into a couple of problems related to connectivity and the interface.

I started by flashing Raspbian and then attempted to connect via SSH. However, with DHCP in use by default, I could not ping the device for lack of a known IP address, let alone connect to it; the assigned IP was eventually visible in my WiFi router's interface.
Once connected via SSH, I updated /etc/wpa_supplicant/wpa_supplicant.conf to set the WiFi network name and password. After connecting, I first set up a static IP over LAN and then installed Node.js and the libraries required for the demonstration.
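
For reference, here is a minimal sketch of those two configuration files. The SSID, passphrase, and addresses are placeholders for illustration, and the static-IP part assumes a Raspbian release that uses dhcpcd (Jessie or later):

```
# /etc/wpa_supplicant/wpa_supplicant.conf
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="MyHomeNetwork"
    psk="my-wifi-passphrase"
}

# /etc/dhcpcd.conf -- static IP for the wired link to the laptop
interface eth0
static ip_address=192.168.0.10/24
```

A reboot (or restarting the networking services) picks up both changes.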

Another thing worth adding is the 'raspi-config' command, a handy utility for the Pi that lets you do various things - I had not known about it initially and was doing those tasks manually.


Instead of simply using it as a storefront server, I plan to use the Pi as an input device (probably with the camera module), setting it up as an input platform to serve as an IoT device, for all that's worth.

Monday, March 14, 2016

Containerization by Docker

Docker is a tool I am getting my hands dirty with these days. Instead of going through the entire process of creating a VirtualBox VM and running some lightweight Linux installation, I can simply use Docker to get a command line that behaves just like its virtualized counterpart. For a large variety of tasks, this helps in setting up the required environment for the code/application to be demonstrated on a single machine. Instead of going the hypervisor route of owning/supporting an entire OS in a separate VM, the approach here is to use best-of-breed tools without being adversely constrained by the existing host OS.

Now that there is a Windows installer, this is even more within reach of the average lazy developer. And for what it's worth, Docker Hub has become more user-friendly than before; it is a kind of app store containing the different images (an OS along with an environment).
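
As a quick illustration of that "command line like its virtualized counterpart" idea, two commands are enough to pull a lightweight image from Docker Hub and drop into a shell inside it. This sketch assumes Docker is installed and its daemon is running; the image name is just an example:

```shell
# Pull a small Linux image from Docker Hub
docker pull alpine

# Start an interactive shell in a throwaway container
# (--rm removes the container when the shell exits)
docker run --rm -it alpine sh
```

Compare that with provisioning a VirtualBox VM: no ISO, no installer, and the container shares the host kernel instead of booting its own OS.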

Off to the races, then!