In our first two blogs on Enterprise SCCM Mashups, we looked at the challenges of using ConfigMgr data while it is still current (eliminating stale data), as well as ways to automate that process. Both blogs are linked below if you missed them.
In today's segment, we'll walk through the process of creating a mashup via a customer example:
We recently worked on developing a mashup based on normalized SCCM data for a large Fortune 50 company. With over 200,000 assets managed by SCCM, the number of records to process was staggering (half a billion records!). IT Project Managers were trying to use spreadsheets to manage different views of the data, but the spreadsheets went stale fast because of the effort required to extract usable data.
They came to BDNA with the goal of normalizing the SCCM data so that it would be easier to work with. After normalizing their data, we identified an opportunity for a mashup between the normalized SCCM data and their HR data. The normalized SCCM data was joined to the HR data using the system owner from SCCM (i.e., TopConsoleUser0 from v_GS_SYSTEM_CONSOLE_USAGE_MAXGROUP). Since there can be more than one entry per machine, we picked the user with the maximum total console time on that machine. After some basic SQL to strip out the domain, we used the userid to join against the HR data set. Initially this HR data set was just a flat file we imported into the database as a table. However, due to the report's popularity, we migrated this process to an automated feed delivered to the database server and loaded via BCP. This process will be built into the automated job system, complete with file watchers for the data feeds. Given the size of the data, we also built a step into the process that materializes some of the views as well-indexed reporting tables, purely for performance.
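The join logic above can be sketched as follows. This is a minimal, illustrative version using SQLite in Python rather than the customer's SQL Server environment: the SCCM view and column names (v_GS_SYSTEM_CONSOLE_USAGE_MAXGROUP, TopConsoleUser0) come from the post, while the HR table and its columns (hr_data, userid, dept_code) and the sample rows are assumptions for demonstration.

```python
import sqlite3

# In-memory stand-in for the real database. The SCCM names follow the post;
# the HR schema and all sample data are illustrative assumptions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE v_GS_SYSTEM_CONSOLE_USAGE_MAXGROUP (
    ResourceID INTEGER, TopConsoleUser0 TEXT, TotalConsoleTime0 INTEGER);
CREATE TABLE hr_data (userid TEXT, dept_code TEXT);

INSERT INTO v_GS_SYSTEM_CONSOLE_USAGE_MAXGROUP VALUES
    (1, 'CORP\\alice', 5400),
    (1, 'CORP\\bob',   1200),   -- same machine, less console time
    (2, 'CORP\\bob',   9000);
INSERT INTO hr_data VALUES ('alice', 'FIN-100'), ('bob', 'ENG-200');
""")

# For each machine, keep the user with the max total console time, strip the
# 'DOMAIN\\' prefix, then join the bare userid to the HR feed.
# (Bare columns alongside MAX() in a GROUP BY is SQLite-specific shorthand;
# on SQL Server you'd use ROW_NUMBER() OVER (...) or a correlated subquery.)
rows = con.execute("""
WITH top_user AS (
    SELECT ResourceID, TopConsoleUser0,
           MAX(TotalConsoleTime0) AS TotalConsoleTime0
    FROM v_GS_SYSTEM_CONSOLE_USAGE_MAXGROUP
    GROUP BY ResourceID
)
SELECT t.ResourceID,
       substr(t.TopConsoleUser0, instr(t.TopConsoleUser0, '\\') + 1) AS userid,
       h.dept_code
FROM top_user t
JOIN hr_data h
  ON h.userid = substr(t.TopConsoleUser0, instr(t.TopConsoleUser0, '\\') + 1)
ORDER BY t.ResourceID
""").fetchall()

print(rows)  # [(1, 'alice', 'FIN-100'), (2, 'bob', 'ENG-200')]
```

Note how the second entry for machine 1 (bob, with less console time) is dropped before the join, matching the "max total time" rule described above.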
In addition to mashing the HR data with the current normalized SCCM data, this application also enabled end users to augment the data with personal tagging and comments. The tagging mechanism let end users mark up particular software or hardware entities with user-defined tags. Reports based on tag counts (e.g., a tag cloud) provided quick access to the specific hardware or software assets carrying a particular tag. In addition to tagging, we built a search interface on top of the normalized data set to make search the initial entry point to the data. Users could make a broad cut with a search term (give me all assets with a Dell D620), then refine the result set to just those with less than 1GB of RAM and Adobe Photoshop installed. Search could also be filtered by the specific HR code for a department a user was responsible for, which proved one of the most exciting ways to view the data. Finally, they could tag these machines with a user-defined tag for quick reference and retrieval later (e.g., tagged NeedsMemory). The underlying normalized SCCM data was refreshed each week, but we had a mechanism in place to ensure the user-applied tags on those entities (hardware/software) persisted across each refresh.
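The tag-persistence mechanism can be sketched like this: keep user tags in their own table, keyed by a stable asset identifier, so the weekly reload of the normalized data never touches them. The schema (assets, user_tags, resource_id) and sample data here are illustrative assumptions, not the customer's actual tables; only the NeedsMemory tag and the search scenario come from the post.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Normalized asset data, rebuilt on every weekly refresh.
CREATE TABLE assets (resource_id INTEGER PRIMARY KEY, model TEXT, ram_mb INTEGER);
-- User tags live in a separate table keyed by the stable resource_id,
-- so they survive a reload of the assets table. (Schema is illustrative.)
CREATE TABLE user_tags (resource_id INTEGER, tag TEXT);
""")

con.executemany("INSERT INTO assets VALUES (?,?,?)",
                [(1, 'Dell D620', 512), (2, 'Dell D620', 2048)])

# A user searches 'Dell D620', refines to < 1GB RAM, and tags the result.
con.execute("INSERT INTO user_tags VALUES (1, 'NeedsMemory')")

# Weekly refresh: wipe and reload the normalized data; tags are untouched.
con.execute("DELETE FROM assets")
con.executemany("INSERT INTO assets VALUES (?,?,?)",
                [(1, 'Dell D620', 512), (2, 'Dell D620', 2048),
                 (3, 'Dell D830', 4096)])

# After the refresh, the tag still resolves to the right machine.
tagged = con.execute("""
SELECT a.resource_id, a.model FROM assets a
JOIN user_tags t ON t.resource_id = a.resource_id
WHERE t.tag = 'NeedsMemory'
""").fetchall()

print(tagged)  # [(1, 'Dell D620')]
```

The design choice is simply to never store tags inside the refreshed tables; as long as the join key is stable across refreshes, user-contributed data and machine-generated data can have independent life cycles.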
In summary, this mashup combined normalized SCCM data, HR data, and user-provided data to deliver not only a tool providing previously difficult, if not unattainable, insight into the company's current asset base, but also a tool the IT Project Managers could take action with.
From this work, a simple Enterprise Mashup evolved to help IT Project Managers escape from managing assets with stale spreadsheets.
In our next and final post in this series, we'll discuss some of the many use cases Enterprise SCCM Mashups have created for customers: a productivity discussion of how IT can become proactive in serving the business by taking action with SCCM data.
by John Pusey, BDNA T.A.M.