This is from the beginning of his 6th grade. (First he wrote it out on 5 rough pages, then moved it into Word.)
Gochaun was hunting in the forest, chasing some Ulies. Ulies are buffalo-like creatures with the ability to camouflage into the background. Gochaun had chased them all morning and was about to make camp when he saw something glint behind a tree. He decided to go check it out. When he got to the tree he was pretty sure the thing was an egg, but it looked nothing like any egg he had ever seen. The egg was half a foot in diameter and a perfect sphere, pure white with thin blue stripes running down it. He decided he would probably be able to sell it, and if that didn't work he would cook it himself.
He went back to his house on the outskirts of a small town called Fjod. He was getting ready to go to the store when he heard something crack. He checked his bag and saw a crack as thin as a thread. The egg suddenly started shaking, and a web of cracks spread across it. The tiny head of a lizard, no, a dragon poked out of a hole in the egg, and then the whole body came out. There were tiny spikes on the dragon's back, and it had translucent wings. He touched the head of the dragon and a searing pain ran up his arm. He also heard a voice in his mind: "I am Fuun." He replied telepathically, "So it is Fuun?", but Fuun already seemed to be asleep.
In the midst of his shock, Gochaun did not notice Tajun approach. "You have one of the last dragons," said Tajun. Gochaun turned to face him. "What do you know about dragons?" asked Gochaun. "Very little, or even less. Dragons do extraordinary things," replied Tajun. Gochaun remembered the pain in his arm and looked at it; he saw a strange tattoo of a limbless dragon running down his whole arm. "That is the mark of a dracolite," explained Tajun. Then Tajun lifted his sleeve, and Gochaun saw a similar tattoo, but with fire. "Why is your tattoo different than mine?" asked Gochaun. "My dragon is fire and your dragon is ice," replied Tajun.
During the conversation, neither Gochaun nor Tajun noticed that Fuun was freezing Gochaun's hair. Gochaun turned to check on Fuun and saw what he was doing. Immediately Gochaun turned his bag upside down, and Fuun glided out and landed at Gochaun's feet. "He is hungry," said Tajun, and he handed Gochaun a chicken. Gochaun held a little bit out in his hand and Fuun gobbled it up hungrily. "More," said Fuun telepathically. Soon Gochaun had fed Fuun the whole chicken, and then Fuun was satisfied. Gochaun and Tajun talked some more, and Gochaun agreed to have Tajun as his mentor.
Dokun, the lord of all evil, heard the whole conversation and planned to kidnap Fuun and use him to take over the world. The next day Gochaun had his first lesson with Tajun, and Fuun had his first lesson with Libai (Tajun's dragon). The next week Dokun struck. While Fuun was in the woods exploring, Dokun sent his demons from the nether realm to restrain Fuun's power and take him to the dragon's cage, the only place that can restrain a dragon. Fuun knew it was no use to fight back when he had no power. Fuun mentally told Gochaun what happened, and out of rage Gochaun teleported to Dokun and started to fight. Gochaun and Dokun fought for 7 days and 7 nights, until finally Dokun teleported Gochaun to the Madaar desert, and Gochaun realized that he needed to control his rage. Gochaun tried to teleport to Fuun but could not lock onto him. He realized he needed to break into the castle by force, and he needed the help of the dwarves who lived on the other side of the desert, so he set off.
Daily, I help teams with the solution engineering aspects of connected vehicle data projects (massive datasets, and always some new datasets with new car models, i.e. new technologies). Lately, in my spare time, I have been applying some ML/deep learning techniques to datasets (many created based on observations of real datasets). I will share some thoughts on my work (the main half of this blog); the other half will be about my family and friends.
Tuesday, August 25, 2015
A relocation journey from MN to TX
I don't want to discuss why I made the decision to move from MN to TX. (But money was not a factor.) However, I was told there is no state tax in TX, that you will save a lot after relocation, etc. So in this post I will start with the finances. (I have to start with something.)
In MN I used to own a small home (1,750 sq. ft.) and paid ~$2K in property taxes, about 1% of the home's value. After moving to TX, I bought a 3,300 sq. ft. home, and now I am paying $10.5K in taxes on a $440K home (around 2.5% of the home's value). In TX, I am not paying any state tax on my full-time paycheck; on a similar salary, I used to pay $6K in MN state tax. In summary, the property tax in TX alone roughly equals the property tax in MN plus the state tax.
For one year, I rented an apartment before buying a home here in Austin, TX. Again, comparing renting in MN vs. TX: in a good neighborhood, rent is at minimum $300 to $400 more per month for a similar-size apartment. Over 12 months, that is roughly $3,600 to $4,800, which is again in the same ballpark as the MN state tax. Bottom line: don't assume you are going to save a lot by moving to TX if you want to send your kids to good schools, whether you rent or buy.
Coming to public services: applying for kids' passports, for example, used to take me 10 minutes at MN post offices. It was very quick. Here in Austin, TX, it is more than a 3-hour process. Same with vehicle registration. For some reason, smaller teams serve a larger public. Particularly if both parents work, it turns into a half-day of forced vacation for this kind of simple errand.
On a side note, gas is on average 30 cents cheaper compared to MN. (It all depends on how many miles you drive. I don't drive a lot for work...)
Another amazing thing about Austin, TX is that school kids play lots of competitive games right from primary school. For example, my kids started playing chess and now compete at many schools. We drive a lot to Dallas, Houston, or San Antonio. Not only chess: there is lots of competition in tennis, swimming, etc. If you want to expose kids to how competitive the world is, this is a very good place.
Friday, July 10, 2015
A few Java util methods used in batch programs
These come up while processing CSV files or mainframe dumps and feeding them to Oracle/MySQL via Spring Batch etc.
// Needs: java.util.Iterator, java.util.List, java.util.Set
public static String convertStringSetToString(Set<String> inputStrSet, char delimiter) {
    StringBuilder sb = new StringBuilder();
    if (inputStrSet != null) {
        Iterator<String> iters = inputStrSet.iterator();
        while (iters.hasNext()) {
            sb.append(iters.next());
            if (iters.hasNext()) {
                sb.append(delimiter); // no trailing delimiter
            }
        }
    }
    return sb.toString();
}

public static String convertStringListToString(List<String> inputStrList, char delimiter) {
    StringBuilder sb = new StringBuilder();
    if (inputStrList != null) {
        Iterator<String> iters = inputStrList.iterator();
        while (iters.hasNext()) {
            sb.append(iters.next());
            if (iters.hasNext()) {
                sb.append(delimiter);
            }
        }
    }
    return sb.toString();
}

public static <T> String arrayToString(T[] array) {
    if (array == null) {
        return ""; // answers the original TODO: return "" on null
    }
    StringBuilder builder = new StringBuilder();
    for (int i = 0; i < array.length; i++) {
        builder.append(String.valueOf(array[i]));
        if (i < array.length - 1) {
            builder.append(", ");
        }
    }
    return builder.toString();
}
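For a quick sanity check, here is how the Set helper behaves. (Assume the methods above live in a class called BatchUtils; that class name is mine, not from the original code.) Note that since Java 8, String.join covers the same ground without a helper:

import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.Set;

public class BatchUtilsDemo {
    public static void main(String[] args) {
        Set<String> ids = new LinkedHashSet<String>(Arrays.asList("A100", "B200", "C300"));
        // Prints: A100,B200,C300
        System.out.println(BatchUtils.convertStringSetToString(ids, ','));
        // Since Java 8, the standard library does the same job:
        System.out.println(String.join(",", ids));
    }
}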
Monday, March 23, 2015
LONE STAR OPEN SCHOLASTIC 2015
Monday, March 09, 2015
Casis Elementary Scholastic Chess Tournament
Location:
Casis Elementary
2710 Exposition Blvd
Austin, Texas 78703
It was a very long day, but Saketh did very well in all his games. Not only did he win the individual first-place trophy, he contributed the maximum points to his school's trophy. His rating increased by more than 300 points with this tournament.
Now the scary part... Dhanvi did not perform well. (However, he had a tough lineup.)
Monday, February 23, 2015
Indian Creek Elementary Chess
Following are a few pictures from this tournament. Both did well; in particular, Dhanvi won first prize. (However, both could have done better based on their performances and past ratings.)
Tuesday, November 18, 2014
My 2014 Black Friday wish list
With Black Friday approaching quickly, the following are on my wish list. I have a $1,500 budget + a $450 Best Buy gift card + a $50 Kohl's gift card.
a) Two laptops (Core i5 processor; sadly, my Toshiba laptop died this year). Most likely I will go with Best Buy to use up the gift card, or the HP Pavilion 15 laptop computer with 15.6" screen from Office Depot.
b) Xbox 360 games. Both Dhanvi & Saketh want Call of Duty: Advanced Warfare, plus two other recent games. All these games are $59.99, so I am looking for a buy 2, get the 3rd free deal.
c) A nice corner desk + desk chair (found something at Staples).
d) A ladder, badly needed for the new house (Home Depot or Lowes).
e) A nice large rug for the living room (Sam's or Costco).
f) Tri-ply stainless steel cookware.
My take on the 2nd day of Lucene Revolution 2014 at Washington, DC
Following is a summary.
The day started with the "SolrCloud at Apple" presentation by the Apple team. It was a solid presentation in terms of the challenges faced during the SolrCloud implementation at Apple. They are still implementing, and it seems they need lots of new features (in the disaster & recovery space) and are trying to automate as much as possible. They referenced some JIRA tasks, and they are contributing something back to the community. Finally, something back to the community.
This session was followed by a Lucidworks presentation on scaling SolrCloud for massive data. This story builds on top of Apple's. A good one. After the above general sessions, I attended the following separate tracks.
a) Solr on HDFS – Past, Present, and Future (Introductory): running Solr on HDFS and its challenges. A solid presentation by Mark Miller.
The next one was just OK. I felt it was a purely hypothetical use case. He was saying some existing customer moved from Solr 3.1 to Solr 4.10 by simply copying the Solr 3.1 config & schema.xml files into Solr 4.10, etc. I asked interesting questions after the session. Still, lots of things were left hidden.
c) Multi-language Content Discovery Through Entity-Driven Search (Advanced), presented by Alessandro Benedetti.
A solid proof-of-concept effort. (Basically, using Freebase, he is trying to classify the content.) At some point I worked on similar POCs. Most likely some site will use this for some kind of subscription-based search... I don't think this will make it big.
Next was a general talk on spatial search. A good one... Overall, I am facing different problems in this space. For my employer, geo-distance is not very useful; I need true driving-distance-based search results. After the session, I also talked to David about this one. It seems this is not possible in the current Solr geo plugin space... I am thinking of forking David's original code and making it customer-specific.
Then a session on relevancy. Actually, I know the challenges in relevancy, and it is difficult to cover them in a 30-minute session. There were not many alternatives, and a friend dragged me to the session anyway. A routine talk: search alone is not good enough, relevancy is needed, and you have to consider signals, end users, click-through, conversion rates... Don't expect much from a Ph.D. guy in a 30-minute talk.
f) Searching 35 Million Images by Color Using Solr (Intermediate), presented by Chris Becker, Shutterstock.
Simply superb. The presentation started with the problem statement, i.e. the problem with image search, followed by a quick demo of how color search makes the difference. After the demo he explained how they implemented it using Solr. I simply loved the approach.
10 Keys to Solr's Future, by Grant Ingersoll, Lucidworks: a typical Grant talk. Nothing special, and I was not expecting anything either.
After the Apple talk, I talked to the Apple Solr team (2 folks) about the size of their search team. The response was: oh, we are 8 to 10 people, we are small, our productivity is great, etc... I stopped listening. He did not know he was talking to someone who has delivered 3 large-scale Solr enterprise implementations single-handedly...
Overall, an average 2nd day. I talked to a few people about their Solr challenges. One of the interesting conversations was about an enterprise cloud search implementation by the Hitachi folks.
Final thoughts... For 3 years in a row I have attended Lucene Revolution, and now I feel most of the sessions are repeats. (Maybe I am doing too many things in Solr.) At this point, my take is that I will not attend next year's Lucene Revolution. (90% decided on this one.)
I will attach a few pictures later.
Dhanvi and Saketh Chess Updates
After moving to Austin, Dhanvi joined a chess club and started playing USCF-rated chess tournaments. With the last Spicewood Elementary tournament, he completed his first year. The following picture shows his progress.
He dreams of becoming a GM and puts in some practice time daily. For most of the year, he was consistent with his play. The sad part is that we are unable to find the right teacher; all his games are based on books and YouTube videos. Hoping we will find a nice chess mentor.
Saketh is also following in his brother's footsteps. He started 6 months later, with the spring 2014 Rackspace tournament. He is still learning, winning, and enjoying it.
Monday, November 17, 2014
A few pictures from the Spicewood Elementary chess tournament
Both Dhanvi & Saketh did well (3.5 out of 5 points).
Considering Dhanvi was sick, his performance was good.
I will write more later.
Friday, November 14, 2014
lucenerevolution 2014 from WASHINGTON, DC, NOVEMBER 13
A few pictures from the keynote.
The keynote from the first CTO was fine: a different perspective on public/government partnership via IT.
I will write more about some of the sessions after reaching Austin.
The "Stump the Chump" session setup is very bad.
I have an interesting problem in the supply chain space. So far no luck. Hopefully I will touch base with a few other Lucene/Solr committers sometime on the 2nd day.
Thursday, October 30, 2014
A few pictures from the Kealing chess tournament
Dhanvi's consistency is still an issue. I will post his one-year progress sometime during Thanksgiving weekend.
Thursday, September 04, 2014
On-demand refresh of materialized views
The use case is simple: some OLTP system is committing updates to the database, and the reporting system should show those updates when the end user refreshes. The core idea is that whenever the end user clicks 'Refresh', a REST API call invokes the following JDBC code to refresh the targeted materialized views. (From my code vault... good old RDBMS days.)
// Needs: java.sql.Connection, java.sql.CallableStatement, java.util.HashMap
public RefreshDataResponse refreshView(String[] names) {
    RefreshDataResponse response = new RefreshDataResponse();
    HashMap<String, String> map = new HashMap<String, String>();
    long lStartTime = System.currentTimeMillis();
    Connection connection = null;
    try {
        connection = jdbcHelper.dataSource.getConnection();
        // DBMS_SNAPSHOT.REFRESH(name, '?'): '?' asks Oracle for a "force" refresh
        String prefix = "call DBMS_SNAPSHOT.REFRESH('";
        String suffix = "','?')";
        for (String name : names) {
            String finalCall = prefix + name + suffix;
            System.out.println(" final call " + finalCall);
            CallableStatement stmt = connection.prepareCall(finalCall);
            stmt.execute(); // throws SQLException if the refresh fails
            stmt.close();
            map.put(name, "TRUE");
        }
        response.setStatus("OK");
        response.setIdAndStatus(map);
    } catch (Exception e) {
        response.setStatus("ERROR");
        response.setErrorMessage(e.getLocalizedMessage());
        e.printStackTrace();
    } finally {
        try {
            if (connection != null) connection.close();
        } catch (Exception ignore) {
        }
    }
    long lEndTime = System.currentTimeMillis();
    long difference = lEndTime - lStartTime;
    System.out.println("MV refresh elapsed milliseconds: " + difference + ", in sec: " + difference / 1000);
    return response;
}
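A hypothetical caller, just to show the shape of the API (the service reference and MV names below are illustrative assumptions, not from the original code):

// Illustrative only: refresh two MVs when the user clicks "Refresh"
RefreshDataResponse resp = refreshService.refreshView(
        new String[] { "SALES_SUMMARY_MV", "INVENTORY_MV" });
if ("ERROR".equals(resp.getStatus())) {
    System.err.println("MV refresh failed: " + resp.getErrorMessage());
}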
Tuesday, August 26, 2014
Search Relevancy Issues: a good example.
My keyword search terms at bestbuy.com are: sony laptop.
See the following screenshot with the search results. Clearly I am looking for a Sony laptop, but the results contain all types of items, including many marketplace items; I am looking for items from Best Buy itself. After selecting the 4Gig/8Gig facets and the Best Buy items tab, the system displays all the laptops. See the following picture. (You can try the same query at Amazon or Walmart and feel the difference.)
Search Findability Issues: a good example.
I don't know what happened to Best Buy's keyword search. Lately, after moving to the new home, I have been searching for fridges by model number. See the following screenshot. My input is WSF26C3EXF, the model number of the highly rated "Whirlpool 26.4-cu ft Side-by-Side Refrigerator". How badly does a model number search fail when it returns no results at all? Try the same input at Lowes or Amazon: the item shows up like a charm. The above case is one fine example of the "findability" issue of search on an ecommerce site.
The next topic is a few relevancy examples.
Tuesday, August 19, 2014
Google Maps APIs... interesting findings. (Issues with the Google Maps API web services)
For a while, I have been using a geocoding service to figure out the longitude & latitude of small-business addresses, so that we can suggest better service within a certain distance. As long as the address is correct, most geocoding services work OK. (Some are good at European & Asian addresses, whereas Google does a great job with US addresses.) However, the main focus of this post is that I ended up with some legacy data containing business addresses. (Most of the data was created in the early 90's; there was no consistent way the addresses were created, updated, or maintained.) They are still valid businesses doing business, but our data is incorrect. See the following example.
https://maps.googleapis.com/maps/api/geocode/xml?address=3100%20N%20COMMERCE%20ST,FORT%20WORTH,MI,US
Address: 3100 N COMMERCE ST, FORT WORTH, MI, US
Google comes back with a response containing a corrected address belonging to the TX state and a TX lat/long: 3100 North Commerce Street, Fort Worth, TX 76164, USA.
Based on the data set, I know this business address belongs to MI.
After ignoring the first address field, the lat/long now comes back as: Fort Worth Drive, Macomb, MI 48044, USA.
Despite my input containing MI, US, Google does not suggest any valid lat/long values in MI on the first call, so I end up making one more call that ignores the first field of my address.
I know two more use cases like this and will append them to this blog post.
Bottom line: dirty data leads to more dirty data.
Even with the Google Maps API, I end up calling twice.
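To make that concrete, here is a minimal sketch of the two-call fallback, using the same XML endpoint shown above. The class/variable names and the crude ">MI<" state check are my own illustrative assumptions (a real implementation would parse the XML), and note that current versions of the Geocoding API also require an API key:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;

public class GeocodeFallback {

    // Fetch the raw XML geocoding response for one address string
    static String geocode(String address) throws Exception {
        String url = "https://maps.googleapis.com/maps/api/geocode/xml?address="
                + URLEncoder.encode(address, "UTF-8");
        // Newer versions of the API also require "&key=YOUR_API_KEY"
        BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream(), "UTF-8"));
        StringBuilder xml = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            xml.append(line).append('\n');
        }
        in.close();
        return xml.toString();
    }

    public static void main(String[] args) throws Exception {
        String street = "3100 N COMMERCE ST";
        String cityStateCountry = "FORT WORTH,MI,US";
        // First call: the full (dirty) address
        String xml = geocode(street + "," + cityStateCountry);
        // Crude check: did the result stay in the expected state?
        if (!xml.contains(">MI<")) {
            // Second call: drop the street line and geocode city/state only
            xml = geocode(cityStateCountry);
        }
        System.out.println(xml);
    }
}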
Bad choice? Any other alternatives?
On a side note: for my input N COMMERCE ST, FORT WORTH, MI, US, Bing Maps gives me the following lat/long coordinates.
Monday, August 18, 2014
Solr deep pagination fix. (For now it scales)
So far, in my earlier Solr implementations of enterprise search, there was no need to export large amounts of data in Excel or CSV form. However, with the GM implementation, there is a need to export a range of search results (for example, rows 300K to 500K of 1 million results found), in order to perform some kind of statistical analysis on them.
The Solr REST API does allow fetching results in small blocks with different start parameters. However, the way pagination was implemented (with all the ranking math), the search response time increases linearly as the start param increases. The following JIRA task explains the details:
https://issues.apache.org/jira/browse/SOLR-5463
See the start & QTime param values.
Solr 4.6.1 search request & response numbers:
/solr path=/select params={start=304000&q=*&json.nl=map&wt=javabin&version=2&rows=1000} hits=698889 status=0 QTime=1089
Solr 4.7.1 search request & response numbers:
/solr path=/select params={start=304000&q=*&json.nl=map&wt=javabin&version=2&rows=1000} hits=698889 status=0 QTime=108
Key SolrJ Java code changes to use the new "cursor mark" feature (s, performSimpleSearch1, and SimpleSearchResponse are our own wrappers around SolrJ):

String cursorVal = "*"; // "*" is the default value for the first page
for (int i = 0; i < noOfTrips; i++) {
    // build the search request; with a cursor, start stays at 0
    s.setCursorMark(cursorVal);
    // execute the search request
    SimpleSearchResponse searchResponse = performSimpleSearch1(s);
    // ...process this block of search results...
    // carry the cursor forward for the next trip
    cursorVal = searchResponse.getCursorValue();
}

Inside the wrapper, the cursor value is set on the SolrJ query and the next one is read back:

if (cursorVal != null)
    query.set("cursorMark", cursorVal);
else
    query.set("cursorMark", "*");
// new SolrJ API method which returns the next cursor mark value
QueryResponse response = getServer().query(query);
String value = response.getNextCursorMark();
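One caveat worth noting (from the Solr documentation, not specific to my code): cursorMark only works when the query's sort includes the uniqueKey field as a tie-breaker (e.g. sort=id asc), so the cursor is deterministic, and the start parameter must remain 0; the cursor itself replaces the offset.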
Sunday, March 30, 2014
Rackspace Spring 2014 Chess Tournament
Both Dhanvi & Saketh participated and won trophies.
For Saketh this was his first tournament, and a positive one.
I will add more details about this event in a blog post.
For now, a few pictures.
Wednesday, March 05, 2014
Solr Velocity-template-based web UI for a database dictionary, aka a database walker
The primary use case here: in most large IT organizations, lots of internal IT applications use some kind of RDBMS as a back-end, and over the years one will see hundreds of databases. With typical silo-style operations, i.e. each department focusing on its own needs only, one generally ends up seeing lots of duplication of data. In my case, I ended up analyzing a very large database (hundreds of schemas or tablespaces, thousands of tables, and a 5-digit number of columns), plus views, materialized views, stored procedures, and more. I noticed lots of people were using Oracle SQL Developer for analysis, jumping from one table or view to other tables in other schemas. After seeing this, I wrote a small database walker. Its primary purpose is to crawl the entire Oracle data dictionary and produce XML. I feed that XML to Solr so that I can build a simple, Google-like interface, using Solr's default Velocity-template-based web UI, to search for tables, columns, schemas, primary keys, and more. I will host the entire project on GitHub. In this post, I am including Oracle table metadata only (i.e. table columns, primary keys, imported & exported keys, and column metadata; I wrote more code to pull stored procedure code etc.).
// Needs: java.io.File, java.io.FileWriter, java.sql.DatabaseMetaData, java.sql.ResultSet, java.util.HashSet
public static void tableInfo(DatabaseMetaData meta, String tableName,
        String tableType, String schemaName) throws Exception {
    String catalog = null;
    String schemaPattern = schemaName;
    String tableNamePattern = tableName;
    String columnNamePattern = null;
    // One Solr XML document per table, staged on disk
    String outputFile = stageDir + schemaName + "_" + tableName + ".xml";
    File f = new File(outputFile);
    if (f.exists()) {
        System.out.print("Skipping->" + outputFile);
        return;
    }
    FileWriter fw;
    try {
        fw = new FileWriter(outputFile);
    } catch (Exception e) {
        System.out.print("Error ... Skipping->" + outputFile);
        return;
    }
    fw.write("<add>\n");
    fw.write("<doc>\n");
    writeField(fw, "id", tableName);
    writeField(fw, "tableName", tableName);
    writeField(fw, "tableType", tableType);
    writeField(fw, "schemaName", schemaName);
    // Physical column metadata
    ResultSet result = meta.getColumns(catalog, schemaPattern, tableNamePattern, columnNamePattern);
    while (result.next()) {
        String columnName = result.getString(4);    // COLUMN_NAME
        String columnTypeStr = result.getString(6); // TYPE_NAME
        writeField(fw, "colName", columnName);
        writeField(fw, "colMeta", columnName + "," + columnTypeStr);
        // Pull logical data-model attributes, when available
        String[] logicalData = LogicalMetadata.getLogicalData(schemaName, tableName, columnName);
        if (logicalData != null && logicalData.length == 7) { // guard for the expected 7-column layout
            writeField(fw, columnName + "_lan", logicalData[4]); // logical attribute name
            writeField(fw, columnName + "_lad", logicalData[6]); // logical attribute description
        }
    }
    result.close();
    // Primary key columns, deduplicated
    ResultSet result1 = meta.getPrimaryKeys(catalog, schemaName, tableNamePattern);
    HashSet<String> set = new HashSet<String>();
    while (result1.next()) {
        String columnName = result1.getString(4); // COLUMN_NAME
        if (!set.contains(columnName)) {
            writeField(fw, "primaryKey", columnName);
            set.add(columnName);
        }
    }
    result1.close();
    // Exported keys: foreign keys in other tables that reference this table
    ResultSet rs = meta.getExportedKeys(catalog, schemaPattern, tableNamePattern);
    while (rs.next()) {
        writeField(fw, "ExportedKeys_Table_Colum_Seq",
                rs.getString("FKTABLE_NAME") + "." + rs.getString("FKCOLUMN_NAME")
                + "." + rs.getInt("KEY_SEQ"));
    }
    rs.close();
    // Imported keys: this table's foreign keys and the primary keys they reference
    ResultSet foreignKeys = meta.getImportedKeys(catalog, schemaName, tableNamePattern);
    while (foreignKeys.next()) {
        writeField(fw, "ImportedKeys_Table_Colum_Seq",
                foreignKeys.getString("FKTABLE_NAME") + "." + foreignKeys.getString("FKCOLUMN_NAME")
                + "." + foreignKeys.getString("PKTABLE_NAME") + "." + foreignKeys.getString("PKCOLUMN_NAME"));
    }
    foreignKeys.close();
    fw.write("</doc>\n");
    fw.write("</add>\n");
    fw.flush();
    fw.close();
}

// Helper: emit one <field name="...">value</field> element
private static void writeField(FileWriter fw, String name, String value) throws Exception {
    fw.write("<field name=\"" + name + "\">" + value + "</field>\n");
}
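For completeness, here is a hypothetical driver for the walker that crawls every table and view in one schema. The connection setup and the schema name "MYSCHEMA" are illustrative placeholders; getTables is the standard JDBC DatabaseMetaData call:

// Hypothetical driver: walk every table and view in one schema
DatabaseMetaData meta = connection.getMetaData();
ResultSet tables = meta.getTables(null, "MYSCHEMA", "%",
        new String[] { "TABLE", "VIEW" });
while (tables.next()) {
    tableInfo(meta, tables.getString("TABLE_NAME"),
            tables.getString("TABLE_TYPE"), "MYSCHEMA");
}
tables.close();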