It is another Saturday in the Twin Cities of Minneapolis and St. Paul. Last night I changed the alarm to wake up at 05:15 AM. That said, I was up around 05:00 AM. My wife and her friend wanted to get an early start shopping. They should be back around 11:00 AM. They both took masks, gloves and umbrellas. The forecast does not call for rain in this area, but around the time they were leaving there were some spotty dark clouds.
The latest and greatest from liberals is to allow the defilement of statues of Christopher Columbus around the country. Not sure what the point is and, more important, what will change with such actions.
Every morning and evening I take a look at the news on my Google Pixel phone. In my opinion, there is nothing of interest to read. A few decades ago journalism was about reporting on topics in an unbiased fashion. Today, depending on the source, most (never say all) news stories, no matter what the subject might be, have unrelated references to politics. For example, if an article is reporting on SpaceX's latest achievement, there are unrelated references to politicians who made election promises a decade or two ago. I was under the impression that the larger news companies (i.e., CNN, The Washington Post, The New York Times, among others) hired editors to make sure they are reporting facts. In general, what is happening is that most news organizations have a strong political (and probably economic) affinity to some political organizations, and add themselves, or allow, unrelated political opinions to be mixed into their reports. That is all wrong. The idea of democracy is for people to form their own opinions based on facts. Today, depending on your reading habits, companies serve you the news they believe you like based on AI-based algorithms. Such an approach will falsely lead you to believe that the entire world thinks like you. Look for different sources and read about science. That will broaden your knowledge and lead you to question what you are being fed by many companies.
On the positive side, this week I read a couple of articles that I thought were quite educational and informative. The article “Meltdown: Reading Kernel Memory from User Space” in the June 2020 edition of Communications of the ACM does a great job describing the technology used by the Meltdown attack. It also covers some countermeasures that have been used to prevent it from happening on computers. The article has ten authors and provides the best description I have read so far.
On a separate note, the article “Random Search Wired Into Animals May Help Them Hunt” by Liam Drew describes a couple of patterns used by most animals to perform walks (searches) when looking for food. Of interest are Lévy walks and Brownian motion. In a nutshell, the hypothesis is that if you are looking for something (i.e., food) then one of the most efficient ways is to use a Lévy walk. Apparently many animals use it when looking for food. The big question for me is how this occurs given that it is not taught. Most important, perhaps we can learn from it and apply related concepts to AI?
By the way, these articles do not contain a single reference to politics. That is excellent writing and reporting!
Now let’s move on to the main subject of this post. I work on a storage server named iCAS. It is a distributed system that is able to store, query and retrieve documents. It runs in the cloud as a service and in private settings if needed.
I am working on a task to convert all iCAS CLIs, which make use of a set of iCAS APIs, so that they return data only in JSON format.
I will cover the development steps I followed to generate a new CLI using a new API. I refer to this as Test Driven Development. Based on what I understand of TDD, it is not the process of having the developer write tests before writing the actual code. As a matter of fact, in general, it is not a good idea to have the same developer or team write both the tests and the application code.
Given that the task is work related, I will not be providing the actual code.
The following table lists the main steps I took along with a short description of each. I will elaborate further on some of the steps following the table.
Step | Description
---- | -----------
1    | SQL code to collect data.
2    | High level database function / method: SDBCountByGroup
3    | Low level database function / method: sdbCountByGroup
4    | CLI source code: countbygroup.exe
5    | Implemented the CASCountByGroup API.
6    | Implemented the entry in the request thread to process the new API call.
7    | Implemented DoCountByGroup to process the request in the iCAS dispatcher.
8    | Updated GUIDListThread to process and return results back to the caller in JSON format.
9    | Implemented JSONCountByGroup to send the JSON results from the iCAS to the client.
Step 1 describes a simple query that seems to encapsulate what is required. I created and tested different versions of the query using SQL Server Management Studio from Microsoft. The iCAS system may use Microsoft SQL Server or MySQL as a relational database engine.
SELECT GroupID, State, CAST(COUNT(*) AS bigint) AS 'Count'
FROM [sdmsql].[dbo].[BITFILE_TBL]
GROUP BY GroupID, State
ORDER BY GroupID, State ASC;
The code shows a selection of a set of fields. The GroupID field is a 32-bit integer, the State field is an ASCII character, and Count represents the number of objects (in this case bitfiles) that match the first two fields. Given the capacity of the iCAS server, we need the count to fit in a 64-bit integer, hence the CAST to bigint.
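Since I am not sharing the actual code, the following is an illustrative sketch only (the structure and field names are made up) showing how one result row maps naturally to fixed-width integer types:

#include <cstdint>

// Hypothetical in-memory representation of one result row.
struct CountByGroupRow {
    int32_t groupId;   // 32-bit group identifier
    char    state;     // single ASCII character state
    int64_t count;     // 64-bit count of matching bitfiles
};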
GroupID     State Count
----------- ----- --------------------
0           I     7160
123         I     72
456         I     48
789         I     24

(4 row(s) affected)
The results shown are from my Windows 10 development machine. It seems that this query fulfills the requirements for the task at hand. It does not make much sense to me to start coding before I can get what I need from the database.
Steps 2 and 3. I wrote the high level call to query the database and return a list of JSON results. I wrote the code based on coding standards. SDBCountByGroup performs different checks and eventually calls sdbCountByGroup, which generates and executes the SQL code against the appropriate database tables using the proper syntax for the engine in use. The results are returned in a list. To develop and test these two functions I used a test utility.
[40] >>> TestCountGroupBy <<< st->text ==>{ "GroupID" : "0", "State" : "I", "Count" : "7160" }<== line: 60772
TestCountGroupBy <<< st->text ==>{ "GroupID" : "123", "State" : "I", "Count" : "72" }<== line: 60772
TestCountGroupBy <<< st->text ==>{ "GroupID" : "456", "State" : "I", "Count" : "48" }<== line: 60772
TestCountGroupBy <<< st->text ==>{ "GroupID" : "789", "State" : "I", "Count" : "24" }<== line: 60772
It seems that the database functions / methods are working and returning the same results we obtained using Microsoft SQL Server Management Studio. So far, so good!
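For illustration purposes only, a rough sketch of how such a high level / low level pair could be layered might look as follows. None of this is the actual iCAS code; the signatures, the return convention and the hardcoded sample row are all made up:

#include <list>
#include <string>

// Low level: builds the engine-specific SQL (SQL Server or MySQL syntax),
// executes it and appends one JSON object per result row. The database
// client calls are elided in this sketch.
static int sdbCountByGroup(std::list<std::string> &jsonRows)
{
    // SELECT GroupID, State, CAST(COUNT(*) AS bigint) AS 'Count'
    // FROM [sdmsql].[dbo].[BITFILE_TBL]
    // GROUP BY GroupID, State ORDER BY GroupID, State ASC;
    jsonRows.push_back("{ \"GroupID\" : \"0\", \"State\" : \"I\", \"Count\" : \"7160\" }");
    return 0;                       // 0 == success in this sketch
}

// High level: performs argument and state checks, then delegates to the
// low level routine and hands the list of JSON rows back to the caller.
int SDBCountByGroup(std::list<std::string> &jsonRows)
{
    // ... validation checks would go here ...
    jsonRows.clear();               // start with an empty result list
    return sdbCountByGroup(jsonRows);
}

The point of the split is that the high level call stays engine-agnostic while the low level routine owns the SQL syntax differences.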
C:\>cascountbygroup -?
cascountbygroup [iCAS TCP/IP]
Retrieve a default set of BITFILE_TBL database counts from the specified iCAS.
 -a <ACCESS code>
 -d <DELAY in seconds>
 -h display this HELP screen
 -p <storage server PORT>
 -s <iCAS TCP/IP>
 -u <USER name>
 -v run the command in VERBOSE mode
The last screen capture shows the usage of the CLI under development. That is just one part of the code that needs to be written in order to provide information to users. I believe in the KISS rule.
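For illustration purposes only, a minimal argument loop for a CLI of this type could look like the following sketch. The option letters come from the usage screen above; the defaults and everything else are made up:

#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <string>

int main(int argc, char *argv[])
{
    std::string server  = "localhost";   // -s (hypothetical default)
    int         port    = 4444;          // -p (hypothetical default)
    bool        verbose = false;         // -v

    for (int i = 1; i < argc; i++) {
        if (std::strcmp(argv[i], "-s") == 0 && i + 1 < argc) {
            server = argv[++i];
        } else if (std::strcmp(argv[i], "-p") == 0 && i + 1 < argc) {
            port = std::atoi(argv[++i]);
        } else if (std::strcmp(argv[i], "-v") == 0) {
            verbose = true;
        } else if (std::strcmp(argv[i], "-h") == 0 || std::strcmp(argv[i], "-?") == 0) {
            std::printf("usage: cascountbygroup [-s <iCAS TCP/IP>] [-p <PORT>] [-v] [-h]\n");
            return 0;
        }
    }

    if (verbose)
        std::printf("server: %s port: %d\n", server.c_str(), port);

    // ... invoke the new API and display the JSON results ...
    return 0;
}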
After spending time getting and parsing arguments, we are ready to invoke the non-existent API. I wrote some code and of course it failed at compile time. The call did not exist at that point.
I wrote the shell for the call to send the request and check if the call was successful. I could not check for returned data because no one was listening on the server side. I had to create a set of ancillary items in order to assign a code for the API, as well as other information used to enable tracing in different modules. After the client side was ready, the iCAS server still did not understand my new request.
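For illustration purposes only, a rough sketch of a client-side shell along these lines might look as follows. The request code, the connection helpers and the structures are all made up; the stubbed connect call mirrors the situation where no one is listening on the server side:

#include <list>
#include <string>

// Everything below is hypothetical, not the iCAS client library.
const int REQ_COUNT_BY_GROUP = 1234;     // assumed code assigned to the new API

struct CASConnection { int socketFd; };  // placeholder for a real connection handle

// Placeholder helpers; a real version would open a TCP socket to the iCAS
// server, frame the request and read the reply.
static CASConnection *casConnect(const std::string & /*server*/, int /*port*/)
{
    return nullptr;                      // stub: no server is listening yet
}
static int  casSendRequest(CASConnection *, int) { return 0; }
static int  casReceiveJSON(CASConnection *, std::list<std::string> &) { return 0; }
static void casDisconnect(CASConnection *) {}

// Client-side shell: connect, send the request code, collect the JSON rows.
int CASCountByGroup(const std::string &server, int port,
                    std::list<std::string> &jsonRows)
{
    CASConnection *conn = casConnect(server, port);
    if (conn == nullptr)
        return -1002;                    // e.g., socket connection refused

    int rc = casSendRequest(conn, REQ_COUNT_BY_GROUP);
    if (rc == 0)
        rc = casReceiveJSON(conn, jsonRows);

    casDisconnect(conn);
    return rc;
}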
On a side note, as you might know, I practice Deep Work or a version of it. I have been doing so for a few decades (since before the book was published). Today is no exception. My two hour timer (I use Eye Defender) went off and, following Pavlov’s classical conditioning, I got up and prepared espresso as I always do after my first block of work. As I mentioned, my wife is out and about with her friend shopping. At home we have two espresso machines; one is on the very high end ($$$) and the other is a typical Italian coffee maker known as a Moka pot, which you will find in any Italian household. In my parents’ house, I grew up with a set of six pots of different sizes. The largest that I have seen is for 50 cups! That is a lot of coffee!
So far today, for breakfast, I had three cups mixed with milk, six cups of straight espresso (it was very good) and will have another three after lunch. That adds up to a full dozen. Since I started drinking coffee as soon as I dropped the milk bottle, I do not seem to get the jitters or miss sleep. In addition, my blood pressure and heart rate are both on the low side.
My oldest son stopped by with some fruit and my wife just arrived home. I will finish this post tomorrow morning.
06/14/20 09:38:37 0x00003be4 - GUIDListThread <<< st->text ==>{ "GroupID" : "0", "State" : "I", "Count" : "7160" }<== line: 467
06/14/20 09:38:37 0x00003be4 - GUIDListThread <<< st->text ==>{ "GroupID" : "123", "State" : "I", "Count" : "72" }<== line: 467
06/14/20 09:38:37 0x00003be4 - GUIDListThread <<< st->text ==>{ "GroupID" : "456", "State" : "I", "Count" : "48" }<== line: 467
06/14/20 09:38:37 0x00003be4 - GUIDListThread <<< st->text ==>{ "GroupID" : "789", "State" : "I", "Count" : "24" }<== line: 467
At this point I am editing the GUIDListThread() function / method so that the iCAS is able to collect the database information from a thread. The iCAS understood the call, dispatched a thread to process the request, and we are now in a new thread collecting the data. As you can tell from the log snippet, we are able to collect the data and display it in JSON format.
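For illustration purposes only, a rough sketch of this server-side flow (the dispatcher handing the request to a thread that collects and logs the rows) might look as follows. The names, the logging line and the stand-in database call are all made up:

#include <cstdio>
#include <list>
#include <string>
#include <thread>

// Stand-in for the database layer call described earlier; it simply
// returns one of the sample rows shown in the captures above.
static int collectCountsByGroup(std::list<std::string> &jsonRows)
{
    jsonRows.push_back("{ \"GroupID\" : \"0\", \"State\" : \"I\", \"Count\" : \"7160\" }");
    return 0;
}

// Worker thread: collect the rows and log each JSON object, similar in
// spirit to the GUIDListThread trace shown above.
static void countByGroupThread(std::list<std::string> *out)
{
    std::list<std::string> rows;
    if (collectCountsByGroup(rows) == 0) {
        for (const std::string &row : rows)
            std::printf("GUIDListThread <<< st->text ==>%s<==\n", row.c_str());
        *out = rows;
    }
}

// Dispatcher side: hand the request to a thread and wait for the results.
int DoCountByGroup(std::list<std::string> &jsonRows)
{
    std::thread worker(countByGroupThread, &jsonRows);
    worker.join();
    return 0;
}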
The next step will be to implement the SendJSONResults() function / method to send the results for this operation back to the client. We could make this function specific to this request or, thinking in future terms, design and implement a generic function that should work with JSON data in general.
C:\>cascountbygroup
{ "iCASCLI" : "CASCountByGroup",
  "ServerIP" : "192.168.1.110",
  "ServerPort" : "4444",
  "Counts" : [
    { "GroupID" : "0", "State" : "I", "Count" : "7160" },
    { "GroupID" : "123", "State" : "I", "Count" : "72" },
    { "GroupID" : "456", "State" : "I", "Count" : "48" },
    { "GroupID" : "789", "State" : "I", "Count" : "24" }
  ],
  "ReturnedValue" : "0" }
This last screen capture illustrates the CLI in operation. The JSON objects displayed match what we have in the database.
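Regarding the generic sender mentioned earlier, and for illustration purposes only, a rough sketch that wraps a list of JSON objects in the reply envelope seen above might look as follows. The socket helper, the signature and the parameter names are made up:

#include <list>
#include <string>

// Placeholder for the routine that actually writes bytes to the client
// socket; the real iCAS transport is not shown here.
static int sendToClient(int /*socketFd*/, const std::string &payload)
{
    // ... write payload to the socket ...
    return (int)payload.size();
}

// Generic sender: wraps any list of JSON objects in the reply envelope
// (CLI name, server address, results array, returned value) and ships it.
int SendJSONResults(int socketFd, const std::string &cliName,
                    const std::string &serverIp, int serverPort,
                    const std::string &arrayName,
                    const std::list<std::string> &jsonRows, int returnedValue)
{
    std::string reply = "{ \"iCASCLI\" : \"" + cliName + "\", ";
    reply += "\"ServerIP\" : \"" + serverIp + "\", ";
    reply += "\"ServerPort\" : \"" + std::to_string(serverPort) + "\", ";
    reply += "\"" + arrayName + "\" : [ ";

    bool first = true;
    for (const std::string &row : jsonRows) {
        if (!first)
            reply += ", ";
        reply += row;
        first = false;
    }

    reply += " ], \"ReturnedValue\" : \"" + std::to_string(returnedValue) + "\" }";
    return sendToClient(socketFd, reply);
}

Keeping the envelope generic (caller supplies the array name and the rows) is what would let the same function serve future JSON-returning APIs.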
At this point it seems that the code needs to be tested by a different developer or quality assurance (QA) group.
C:\>cascountbygroup
{ "iCASCLI" : "CASCountByGroup",
  "ServerIP" : "192.168.1.110",
  "ServerPort" : "4444",
  "Counts" : [ ],
  "ReturnedValue" : "-1002" }
This last screen capture illustrates the CLI when the iCAS server has not been started. The ReturnedValue contains a rather cryptic number. All error codes are properly documented and associated with a short description. Perhaps the call could return WAR_CONNECTION_REFUSED or “Socket connection refused”. I will experiment and ask which of the three options is more appropriate.
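For illustration purposes only, a rough sketch of how a cryptic numeric code could be mapped to a symbolic name and a short description might look as follows. Only the -1002 / connection refused pairing is suggested by the capture above; the table layout and entries are made up:

#include <string>

// Hypothetical error table entry; the actual iCAS error catalog differs.
struct ErrorEntry {
    int         code;         // numeric code returned to the caller
    const char *symbol;       // symbolic name
    const char *description;  // short human readable description
};

static const ErrorEntry errorTable[] = {
    { -1002, "WAR_CONNECTION_REFUSED", "Socket connection refused" },
    // ... additional documented error codes would go here ...
};

// Look up the description for a code; fall back to the raw number.
std::string describeError(int code)
{
    for (const ErrorEntry &e : errorTable) {
        if (e.code == code)
            return std::string(e.symbol) + " (" + e.description + ")";
    }
    return "error code " + std::to_string(code);
}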
I want to note that there were other functions / methods that I did not cover. The idea here is to convey the general approach.
For obvious reasons, the code for this project will not be posted on my GitHub repository.
If you have comments or questions regarding this, or any other post in this blog, or if you would like me to be of assistance with any phase in the SDLC (Software Development Life Cycle) of a project associated with a product or service, please do not hesitate to leave me a note below. If you prefer, send me a private message using the following address: john.canessa@gmail.com. I will reply as soon as possible.
Keep on reading and experimenting. It is the best way to learn, refresh your knowledge and enhance your developer toolset!
One last thing, many thanks to all 1,141 subscribers to my blog!!!
Keep safe during the COVID-19 pandemic and help restart the world economy.
John
Twitter: @john_canessa