Showing posts with label Data. Show all posts

Thursday, August 25, 2011

The Preliminary Specification Part X (PLM Part V)


This post will represent my last kick at trying to sell the “user vision” as something more than just a technological toy or an interesting thing someone wanted but no one would ever use. I could certainly see how the user vision could be perceived as falling into either of those two classifications. However, if either of those were the result of what I thought the technology would be used for, I would be guilty of mis-communicating where I think the value lies in having the user vision implemented in the People, Ideas & Objects marketplace modules.

To have the interface operational, with people using it to browse as they do with web browsers today, would be somewhat limiting in terms of the potential of what I see. With so much data and information available now, and particularly in the future, how do we represent this data's use in a manner that is usable to the user? The first requirement of the data is that it has to be reliable and have meaning. If the data and information contained within the user vision and marketplace modules are populated by public and private databases for well information, lease data, vendor and supplier inventory, scheduling calendars, and other unstructured data, then producers, suppliers and others could conduct actionable events on that data and information.

When I say that information is populated into the system, these should be live links that could include some of the information from a firm's accounting systems. The point I am attempting to make is that the information would be of value and would therefore impute meaning into the “marketplace” modules where the “user vision” was employed. This would make the technology aspect of the device the secondary reason for its use. The primary reason for the user vision's use would be the business purpose of the user. The reason the user would continually return to the marketplace would be the accuracy of the data and information that they find, and the interactions within the marketplace.

To develop new systems for the oil and gas industry is an opportunity that I think is once in a lifetime, maybe once in a century. To think that we will use these systems in the manner that we use systems today underestimates the possibilities. The user interface is the area where the most innovation will occur in terms of how people will interact with larger volumes of data. These and other types of issues should be considered in the Preliminary Specification. I offer the “user vision” and these posts to initiate the discussion and expand the scope of what is possible in terms of the research that is undertaken. I thank you for your patience.

For the industry to successfully provide for the consumers energy demands, it’s necessary to build the systems that identify and support the Joint Operating Committee. Building the Preliminary Specification is the focus of People, Ideas & Objects. Producers are encouraged to contact me in order to support our Revenue Model and begin their participation in these communities. Those individuals that are interested in joining People, Ideas & Objects can join me here and begin building the software necessary for the successful and innovative oil and gas industry.

Please note what Google+ provides us is the opportunity to prove that People, Ideas & Objects are committed to developing this community. That this is user developed software, not change that is driven from the top down. Join me on the People, Ideas & Objects Google+ Circle and begin building the community for the development of the Preliminary Specification. Email me here if you need an invite.

Monday, August 22, 2011

The Preliminary Specification Part VIII (PLM Part III)


In the past two posts regarding the Preliminary Specification's output, we have noted two research projects for the community to undertake. The first relates to all three of the “marketplace” modules and is to determine whether the user interface would incorporate elements of the “user vision”. The second relates to the Petroleum Lease Marketplace module and is to determine whether the data and information are to be stored at the Joint Operating Committee level or at the producer. This would enable many changes, the most important of which is a revised division of labor and specialization in the administration of oil and gas.

The implications of the decisions from these research projects are far reaching. The entire Preliminary Specification is a substantial opportunity for the industry to participate in a significant redesign of how it operates. By realigning the frameworks of the hierarchy, those being compliance and governance, with the legal, financial, operational decision making, cultural, communication, innovation and strategic frameworks of the Joint Operating Committee, we are leaving no stone unturned. We have the opportunity to address everything and make it far more natural in how it operates.

The budget for the Preliminary Specification has taken on the controversial size of $100 million. We begin to see how that size matters in the type of work that needs to be done. These two research projects are only small subsets of the work to be undertaken, but they are budgeted for. It is this type of research that will benefit the system as we move forward. Research decisions made at this early stage will have a dramatic impact on the system when it is developed. Change after the Preliminary Specification becomes difficult and very costly. Now is the time to address these difficult points, and now is the time to address them appropriately, with the appropriate research budgets.

I don’t want to get into the details of what is required in each of the research projects. Once you start thinking about them, they become quite large issues, and that is why they will be addressed in the Preliminary Specification.


Sunday, August 07, 2011

McKinsey on Big Data


In this paper McKinsey write about what they call “Big Data,” which they define as “datasets whose size is beyond the ability of typical database software tools to capture, store and analyse.” Although their recommendations are limited to this definition, I find that they apply to the growth in data in general. In particular, McKinsey's recommendations may be applicable to the data sets in use in business, and particularly to the growth of data that may occur in the very near future. Specifically in oil and gas, it is anticipated that the demand for data will continue to grow. People, Ideas & Objects have anticipated this growth and based our architecture on a Technical Vision that can accommodate it.

McKinsey’s “Big Data” recommendations are listed here. The full report is available here. It is a comprehensive report, however I do recommend reading at least the Executive Summary.

  1. Data have swept into every industry and business function and are now an important factor of production.
  2. Big Data creates value in several ways.
    1. creating transparency
    2. enabling experimentation to discover needs, expose variability, and improve performance
    3. segmenting populations to customize actions
    4. replacing or supporting human decision making with automated algorithms
    5. innovating new business models, products and services
  3. Use of Big Data will become a key basis of competition and growth for individual firms.
  4. Use of Big Data will underpin new waves of productivity growth and consumer surplus. 
  5. While the use of Big Data will matter across sectors, some sectors are poised for greater gains. 
  6. There will be a shortage of talent necessary for organizations to take advantages of Big Data. 
  7. Several issues will have to be addressed to capture the full potential of Big Data.

To deal with the associated issues of data, the People, Ideas & Objects Draft Specification provides two modules: the Analytics & Statistics Module, which looks at the producer firm's data, and the Performance Evaluation Module, which looks at the Joint Operating Committee's data.



Tuesday, April 14, 2009

Follow on to yesterday's post.

Google's new tool, Google Insights for Search, was highlighted yesterday. I was thinking all through the day what a tool like that would be like in the oil and gas industry. As a result, I felt I didn't attach enough emphasis to the implications of having the data available for the type of analysis that is possible with the tools I described in the Draft Specification.

First let's go back to the situation at hand today. Data is scattered throughout the organization in a number of informal and unknown spreadsheets, databases and file cabinets. There are production departments, accounting departments and exploration departments that use very similar data and store it in their own file cabinets and electronically. As we know, departments only speak to each other at the higher levels of the organization. Hence the lack of communication, and shared data remains an unfulfilled promise.

A lot of this data is not structured or captured in a centralized database. The advantage of using a database is that it allows different users to perceive the data in different ways. Much of this problem has been addressed by the various POSC and PPDM data models. However, not many of the software vendors or companies have been able to implement the data models in an optimal way. Consider also that polymorphic behavior, which is a cornerstone of the Java Programming Language, allows users to perceive different methods or ways of processing the data. You begin to see the flexibility and opportunity that is missing with these poor data implementations.
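A minimal sketch of what that polymorphic behavior looks like in Java may help. The class and method names here are hypothetical illustrations, not part of the POSC or PPDM models: two users apply different methods to the same well data through one interface.

```java
import java.util.List;

// Illustrative only: one interface, two ways of perceiving the same data.
interface WellDataView {
    double summarize(List<Double> dailyProduction);
}

// An accountant may care about total volume for revenue allocation.
class AccountingView implements WellDataView {
    public double summarize(List<Double> dailyProduction) {
        return dailyProduction.stream().mapToDouble(Double::doubleValue).sum();
    }
}

// An engineer may care about the average daily rate for decline analysis.
class EngineeringView implements WellDataView {
    public double summarize(List<Double> dailyProduction) {
        return dailyProduction.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
    }
}

public class PolymorphismDemo {
    // The caller never changes; the behavior is selected by the view passed in.
    public static double report(WellDataView view, List<Double> production) {
        return view.summarize(production);
    }
}
```

The same production record yields a total for one user and an average for another, without either knowing how the other processes it.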

When we talk about the Security & Access Control Module in People, Ideas & Objects, we begin to see the importance of getting all of this data organized and accessible by the right people.

Imagine what it would be like if people were able to access the same data in the same format for the entire Joint Operating Committee, and this applied across the entire industry, where the employees and contractors authorized to access the data are all trained in the generic industry data models. Everyone would know where the data they need is, and would be able to access it from their clients in an authorized fashion and analyse it effectively for new information.

Lastly, the Technical Vision of People, Ideas & Objects essentially lays out the means for an explosion of data in every corner of the producer's domain. This is not a result of the application being built; this data will become real on its own. The tools to make it so are now readily available, and it is only a matter of time before they are generally deployed. If the data is not organized today, when and how will it be organized in the future, with exponentially more data, risk and complexity?


Monday, April 13, 2009

Google Insights for Search

Google recently released a new service called Google Insights for Search. (Click on the title of this blog post to be taken to the announcement on the Google Research blog.) Once you go to the site (http://www.google.com/insights/search/#) it will subsequently show up as an application in your Google Account. There is a related .pdf written by Hal Varian and Hyunyoung Choi that details how to use this new and interesting tool.

In the Draft Specification we have two modules that are somewhat related to this product. Both the Analytics & Statistics Module and the Performance Evaluation Module are user defined tools that allow detailed analysis of the data in People, Ideas & Objects. The data is the key attribute of these modules. With the People, Ideas & Objects Technical Vision expecting data volumes to explode through IPv6, Java and wireless access to the data in a known format, data analysis tools like the Analytics & Statistics and Performance Evaluation Modules will be the key to obtaining value from it. That is essentially what Google has done with the Google Trends data: published it through a known API and interface.

Allowing users to access formatted data is the key to the value in using the tools and understanding the data. How many firms have data scattered across many departments and on individual machines? How much of this data is available through known and trusted access mechanisms? What tools are available for people to interpret and analyze the data? We have the opportunity to embed the R application into the modules, among other opportunities.
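As a hedged sketch of the kind of user-defined analysis an Analytics & Statistics Module might perform before handing data off to a tool such as R, consider simple summary statistics over a production series. The class name is an illustration of mine, not part of the Draft Specification.

```java
import java.util.List;

// Hypothetical helper: summary statistics over a series of production
// figures, the sort of pre-processing an analytics module might expose.
public class ProductionStats {
    public static double mean(List<Double> xs) {
        return xs.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
    }

    // Sample standard deviation (n - 1 denominator), matching R's sd().
    public static double stdDev(List<Double> xs) {
        double m = mean(xs);
        double ss = xs.stream().mapToDouble(x -> (x - m) * (x - m)).sum();
        return Math.sqrt(ss / (xs.size() - 1));
    }
}
```

For the series 2, 4, 6 this yields a mean of 4 and a sample standard deviation of 2, the same figures R's `mean()` and `sd()` would return.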

I highly recommend downloading the R application, and the .pdf from Google on how to use Google Insights for Search and Google Trends data. Experimentation with these will provide you with the beginning of an understanding of the opportunities we will make available to our users. And please, join me here.


Sunday, March 15, 2009

The Internet @ twenty.

There is a video presentation of Tim Berners-Lee at the TED Conference in February 2009. (Click on the title of this entry for the video.) In the video he discusses his actions and concerns regarding his development of the World Wide Web. Many have claimed, including vice-president Al Gore, to have invented the web, but only Tim Berners-Lee is recognized as doing so. Knighted by the Queen, Sir Berners-Lee notes in this video that the web will soon be 20 years of age, and indeed that was recognized on Saturday, March 14, 2009.

Berners-Lee worked for CERN, the group responsible for the Large Hadron Collider. His work at CERN demanded a solution to the large volume of visiting physicists working there. Specifically, Berners-Lee was concerned about the loss of knowledge through the large turnover at CERN. I think this parallels the potential loss of knowledge in the oil and gas industry, one that is addressed in two modules of People, Ideas & Objects: Research & Capabilities and Knowledge & Learning.

Berners-Lee goes on to talk about the future of the Internet and the role that data will have in that future, asking what would happen if linked data were more readily available, and simply stating that people would use it and make new and innovative ideas from it. Scientists in many disciplines have theories and ideas, but no data. He noted the efforts of dbpedia, and I would add Freebase.com.

In People, Ideas & Objects two modules address the need for additional data: the Performance Evaluation Module (Joint Operating Committee focused) and the Analytics & Statistics Module (firm focused), each working from one of the two different perspectives of the People, Ideas & Objects application. Please join me here.


Tuesday, July 29, 2008

In a word, Whiplash.

U.S. Energy Secretary James R. Schlesinger (1977 - 1979) once stated that energy is framed by two emotions, complacency and fear. There is an air of complacency since the oil price has fallen over $20. How distant the problems of earlier this month seem. It almost makes sense to fill the tank again.

How much of this price change is the result of the inventory builds in the U.S. is unknown at this time. Over the past two weeks we have seen exceptionally large builds as it is rumored that U.S. consumption dropped substantially. The two weeks of inventory build was preceded by an unusually large draw down of inventories the week before. I hope this is a sign of the effect of higher prices on consumer demand, but I think we may also be in for a bit of a surprise.

In Supply Chain Management there is a phenomenon known as whiplash, also called the bullwhip effect, and the analogy is apt. You learn the intricacies of this phenomenon by conducting a simulation of a beer supply chain. The retailer, distributor, warehouse and brewery are each represented by one of four individuals. The objective is to keep the appropriate amount of beer in stock to satisfy your company's needs.

Starting the game with minimal supply in each location, you pass information confidentially from one area of the chain to its immediate neighbors. As supply demands fluctuate, the effect on inventory begins to swing between two extremes. One moment you have an excess, which reduces your next order; then you are faced with a draw down of inventory, and the supply never recovers. Once the phenomenon is in the supply chain it is very difficult to remove. The variance in inventory at all four locations provides absolutely useless information.
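The amplification the beer game demonstrates can be sketched in a few lines. This is a simplified model under assumed numbers, not the actual game: each tier orders enough to cover the order it received plus the gap back to a target inventory, so a small retail order grows at every step up the chain.

```java
// Minimal sketch of whiplash (the bullwhip effect). The ordering rule,
// target stock of 10, and starting inventories are illustrative assumptions.
public class Bullwhip {
    // One tier's ordering rule: cover the incoming order and restore target stock.
    static int placeOrder(int incomingOrder, int inventory, int target) {
        return Math.max(0, incomingOrder + (target - inventory));
    }

    // Propagate one retail order through four tiers (retailer, distributor,
    // warehouse, brewery), each slightly short of its target inventory.
    public static int[] propagate(int retailDemand) {
        int[] inventory = {8, 7, 6, 5}; // each tier below its target of 10
        int[] orders = new int[4];
        int order = retailDemand;
        for (int i = 0; i < 4; i++) {
            order = placeOrder(order, inventory[i], 10);
            orders[i] = order;
        }
        return orders;
    }
}
```

A retail order of 4 becomes orders of 6, 9, 13 and 18 up the chain: the brewery sees more than four times the true consumer demand, which is the useless inventory signal described above.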

If, as I suspect, whiplash has entered the U.S. inventory of energy, then we may see the resumption of demand and a significant draw down in inventory, leading to price increases and so on...


Monday, September 17, 2007

Peak Oil's turning point?

A variety of news and information makes the Peak Oil issue somewhat more real than yesterday. First up, Shell's former Chairman Lord Oxburgh says the industry "has its head in the sand" and warned:

We may be sleepwalking into a problem which is actually going to be very serious and it may be too late to do anything about it by the time we are fully aware.
and
And once you see oil prices in excess of $100 or $150 a barrel the alternatives simply become more attractive on price grounds if on no others.
The Association for the Study of Peak Oil published some comprehensive analysis of US oil import data. Of the many countries that export oil to the US, which will be sustainable for the long term? This analysis answers that question, and the following two graphs reflect it.

[Two graphs from the ASPO analysis of US oil import sustainability appeared here.]

Our friends at the Energy Bulletin have noted such luminaries as Lee Raymond, formerly of Exxon Mobil and the National Petroleum Council, and former Energy Secretary James R. Schlesinger quietly reflecting on the probabilities and possibilities of Peak Oil. Another excellent resource highlighted by Energy Bulletin is the Department of Energy report "Peaking of World Oil Production: Recent Forecasts."



Sunday, April 01, 2007

Enterprise search and security.

In the User Vision I noted the ability to search the domain of the user. A far easier thing to say than to do. Consider for a moment the number of companies within the industry. Consider the number of Joint Operating Committees (JOCs) they participate in, and then consider the number of users that will be involved in preparing and using corporate data. When users may fulfill different roles in different JOCs for different client companies, one begins to see the issue regarding their ability to search for their information.

The idea that search and security would be linked would have seemed oxymoronic a few years ago. How could search maintain and build upon the security of a Service Oriented Architecture (SOA) such as the one being written about in this blog? Firstly, the top priority in the development and operation of any application of this type is the quality, integrity and security of the data being used by producers and users. At the same time, search will become an indispensable competitive tool for any oil and gas producer. Access through extensive, state of the art search technologies is a critical requirement for the oil and gas producer and user. Another critical issue is users' expectation of the near single-shot relevancy provided by search giant Google. A brief review of the features of the technical architecture as proposed here follows.

Authorized access will be granted to users through the world wide web. Recall that the use of a private network using IPv6 provides the enhanced security that is inherent in the protocol. The producers will also access their applications from the Grid that is owned and operated as a service by Sun Microsystems. Hosting of the Genesys application by Sun provides a level of third party reliability and security that is necessary for the application. Genesys will focus on research and development of systems, not compete with Sun on infrastructure.

Each producer will have a virtualized Solaris environment on the Grid, an Ingres open source database instance, and a Genesys application server, all operating side by side with other producers, possibly on the same processor. Provided that firewall and other security requirements are in place, each producer will access their, and only their, application and data. In addition, each virtualized environment will have a Google Enterprise Search Appliance maintaining the access control lists, search security, and search indexes. Information about Google's Enterprise Search Appliance can be found here, and their Enterprise blog here. Information on Sun's virtualization of Solaris is here.

Deciding between money, time, and quality: as with any system development, you are entitled to two of these objectives at the expense of the third. In the case of search security, and security in general, time and quality will come at the expense of money. Although Solaris and Ingres are free as open source, they command large fees for operational services, and the Google Search Appliance is also relatively expensive.

I found a website and consulting firm that has dedicated itself to enterprise search and security. Idea Engineering have a newsletter that provides the necessary discussion of many of the issues companies will need to address in the future. I am highlighting a series of articles they wrote across several newsletters that provide value for readers here. The articles are here, here and here.

A couple of the assumptions I am operating under should be stated explicitly. First, we have design freedom in terms of how the application is built. Second, the cost of one million instructions per second (MIPS) of processing power is now $0.01 (processor costs only), enabling intense yet affordable processing capability. Think of encryption, virtualization for each producer and each employee, heavy and multiple indexing algorithms and access control lists; processing demand will be very high. Add the unique perspectives that are part of this blog, like Military Command Structures and Single Sign On (SSO), which is a necessary feature.
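To make the access-control-list idea concrete, here is a minimal sketch of a check where one user holds different roles in different JOCs and each document grants access to certain roles only. All names and the role model are hypothetical illustrations, not a proposed design.

```java
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: a user's role varies by Joint Operating Committee,
// and a document is readable only by the roles it grants in that JOC.
public class JocAccessControl {
    // e.g. "joc-alpha" -> "operator", "joc-beta" -> "accountant"
    private final Map<String, String> rolesByJoc;

    public JocAccessControl(Map<String, String> rolesByJoc) {
        this.rolesByJoc = rolesByJoc;
    }

    // Readable only if the user's role in that JOC is among the allowed roles;
    // no role in the JOC means no access at all.
    public boolean canRead(String joc, Set<String> allowedRoles) {
        String role = rolesByJoc.get(joc);
        return role != null && allowedRoles.contains(role);
    }
}
```

Evaluating this per document and per user for every search result is what makes secure search so processing-intensive, which is where the cheap MIPS assumption above carries the weight.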

Lastly, the manner in which I see this application being built is through the ultimate users. What I would like to see happen is a discussion around these points that fills in some of the detail and ferrets out the finer points and issues. It is the users' application, and their involvement is being called on for this critical issue.


Thursday, March 22, 2007

Another security concern.

I wrote earlier about my concern over the security risks associated with the new Zune and web phones, and that includes Apple's new iPhone. These large hard drives with wireless connectivity could be accessing corporate data without anyone knowing. The need to encrypt your network is critical these days, but it is also necessary to store your data in encrypted form, a very difficult task for a company. The new threat I am writing about today will also be mitigated by high level encryption on the network and in storage. I recommend Sun Microsystems' elliptic curve cryptography technology.

The other product that has popped up and concerns me is Adobe's Apollo platform. In an attempt to "pick up where Java has left off," they have created a "run-time" that enables web applications to also operate as desktop applications. The manner in which they do this is, of course, by enabling Apollo to access lower level operating system functions. This is where Java has drawn the line: no Java application can access the data and systems of a client machine. Apollo takes this security precaution and throws it into the garbage, offering any user a tool that will enable anyone to provide web and desktop applications without knowing what is really going on. Behind the GUI application, another part of the same application may be copying data, destroying data, or whatever it may want. There is literally nothing to stop the user from being entertained or distracted while it goes on its merry way through your client machine and network.

The key to solving this problem is not to download the "run-time." Difficult when you have many users. The "run-time" is necessary to run the "p" code that the applications will be distributed as. "p" code is not full binaries, but also not source code, so the user cannot look at the code and determine what it is actually doing. Without the "run-time," when a web site uses some Apollo functionality it will be unable to morph itself down to the operating system level, disabling that feature of the website.

The other problem is the popularity this platform will have. The demonstration I saw was of an eBay Apollo application and included credit card numbers and access to the file systems. Users need to get their work done, and more and more that is all they are concerned about; hence they will use what works, irrespective of the consequences of what they don't know or don't understand. The only people I think are going to be interested in writing applications for Apollo are the ones currently writing viruses. The Apollo "run-time" doesn't just let them in, it invites them in. No software vendor concerned with the security and reliability of client systems will write to the Apollo "run-time," therefore it may simply be a matter of policy that users are disallowed from using or downloading it. But then again, a good virus writer could probably install the "run-time" for the user.


Friday, March 16, 2007

A new tool.

Marissa Mayer of Google just announced "gapminder" for a preview look. What an unbelievable tool. The world will be seen not through the data elements, but through tools like gapminder. The address of the demo is...

http://tools.google.com/gapminder/
