PAL v2.0 Ideas

Coordinator
Jan 23, 2008 at 1:01 AM
Please let me know of ideas for PAL v2.0.

PAL is in need of a major overhaul; it's becoming increasingly difficult to diagnose bugs in it. Also, PAL is not usable on 64-bit due to its dependency on Microsoft Log Parser, so I need to find a replacement. I'm considering using WPF (Windows Presentation Foundation) to create charts, but this takes on a dependency on the .NET Framework v3.0. This is the only option I have found so far where I can create charts for free, unless any of you have other suggestions. The intended use of PAL is for workstations/laptops, so I don't see any problems with taking on this dependency.

Here are some of the features and dependencies I have in mind:

  1. WPF: Taking on the dependency of WPF (.NET Framework v3.0)
  2. No more Log Parser: While I really like Log Parser, it's not 64-bit compatible. Therefore, no dependency on Microsoft Log Parser.
  3. Engine Separation: Separation of the PAL engine into a separate assembly allowing other developers to use it in their applications.
  4. Wizard Conversion: Converting the PAL tool as it stands now into a wizard.
  5. Perfmon Log Administration: Administration of perfmon logs (creation, collection, status, and analysis) on remote computers. This would allow “auto-fill” of the question variables for each of the logs.

Please let me know your thoughts.

Thank you,
Clint Huffman (clinth@microsoft.com)
Jan 27, 2008 at 8:27 PM
Hi Clint,

Thanks for this tool; I am using it more and more. This is really a great job!

I would like the capability to specify the report file name. This is quite useful for easily identifying which analysis you have run, especially when using PAL in batch.

Thanks.

Pat
Jan 28, 2008 at 9:54 PM
Also, it would be nice if the report provided the start and end dates over which the analysis was done; currently, the "Tool Parameters" section only provides the analysis interval.

Thanks
Feb 2, 2008 at 4:09 AM
Hi Clint,

These are excellent ideas. I really like the idea of separating the PAL engine. Modularity is key to extensibility. Since PAL 2.0 will be a major upgrade, what about building in new features that will allow PAL to go beyond just reading binary log files? Performance monitor has been enhanced a fair amount in Vista and Windows 2008 and I expect that there will be further changes to performance data gathering tools in Windows. One of the less used features is the option of sending perfmon data to a SQL server. I'm guessing that if PAL 2.0 doesn't use LogParser it will require some sort of SQL engine to parse the data from the logs. On the other hand, I don't think people will want to install SQL 2005 (even the express) edition on a workstation, so there must be another way. Also, if PAL 2.0 is going to include options to administer perfmon logging, it would be really helpful to have a tool that could create perfmon log profiles based on server type (SQL 2000/2005, IIS, Exchange, etc.) and then gather the appropriate data. You could have a PAL 2.0 dashboard that would go out and query data either directly from the servers or perhaps even consume data from System Center.

Also - PAL 1.x is a great tool as it stands. I find myself using it more all the time. I especially like the extensible performance profiles with performance thresholds and alerts that can be easily customized.

Thanks!
Coordinator
Feb 7, 2008 at 1:47 AM
I have considered using SQL Server many times and I have seen many other tools using it. With that said, adding SQL Server (express or full) would be a huge requirement hurdle for a modest tool, so I would rather not go that route unless I had no other choice.

Yes, Win 2008 and Vista have a great new user interface for looking at perf counters. I even had a bit of influence on it a few years ago. After I rewrite PAL in .NET, I'll consider adding the Vista sysmon control to the tool, but most people just want to copy and paste the report data into other reports, so I'm sticking with the HTML reports for now.

Regarding gathering the appropriate log file data, the PAL tool does this today. You simply select the threshold file (IIS, SQL, BizTalk, etc.) and click the Export button. This will generate a perfmon log template with all of the counter data needed for the PAL analysis later.

I'm glad you like the tool and thanks for the feedback.

EdZ wrote:
Hi Clint,

These are excellent ideas. I really like the idea of separating the PAL engine. Modularity is key to extensibility. Since PAL 2.0 will be a major upgrade, what about building in new features that will allow PAL to go beyond just reading binary log files? Performance monitor has been enhanced a fair amount in Vista and Windows 2008 and I expect that there will be further changes to performance data gathering tools in Windows. One of the less used features is the option of sending perfmon data to a SQL server. I'm guessing that if PAL 2.0 doesn't use LogParser it will require some sort of SQL engine to parse the data from the logs. On the other hand, I don't think people will want to install SQL 2005 (even the express) edition on a workstation, so there must be another way. Also, if PAL 2.0 is going to include options to administer perfmon logging, it would be really helpful to have a tool that could create perfmon log profiles based on server type (SQL 2000/2005, IIS, Exchange, etc.) and then gather the appropriate data. You could have a PAL 2.0 dashboard that would go out and query data either directly from the servers or perhaps even consume data from System Center.

Also - PAL 1.x is a great tool as it stands. I find myself using it more all the time. I especially like the extensible performance profiles with performance thresholds and alerts that can be easily customized.

Thanks!

Feb 22, 2008 at 9:45 AM
Just found this tool, fantastic!

I agree with "PKSMania": the ability to specify the output path and filename for the HTM report would make automatic generation of weekly reports a breeze in a batch process.

In my opinion, don't go down the SQL Server route. This product is small and portable and great for both ad hoc and larger-scale analysis; placing a dependence on SQL Server is overkill for this type of utility.
Coordinator
Feb 22, 2008 at 6:54 PM
Specifying the output path is effectively the #1 requested feature. I'll see if I can get it added within the next few business days. If so, I'll post it as a new release.

Yeah, I don't plan on going the SQL Server route unless it becomes unavoidable.

I've been considering removing the dependency on Microsoft Log Parser, but eventually I want to add ETW, Event Logs, IIS logs, etc., to the PAL engine, and the Log Parser tool is the best way to query all of these file types in a similar fashion. Also, I've been struggling with converting PAL to .NET. With all of this in mind, I think I'll stick with the VB.NET/VBScript hybrid (the current architecture) for now.

I'll also focus on globalizing PAL, which means that PAL would be able to parse perfmon logs written in other languages such as Spanish and German.

Thanks for the feedback.
Feb 26, 2008 at 4:06 PM
Hi Clint,

Now I need to compare several servers at once, so I need to build pivot tables and charts from common error factors across servers. This raises a question about how PAL outputs its results. Would a CSV format with no graphs be a good idea? Or should PAL be taught to consider multiple servers at once? Is there a need for that?

The fun and easy part: I can do the Hungarian translation if anyone needs it. :)

Thanks for the time and effort; it's very, very valuable!

Best Regards,
Ferenc Matyas
Coordinator
Feb 27, 2008 at 9:46 PM
PAL has a feature where you can have it produce either HTML output (the report we typically see) or XML data. Use the argument /ISOUTPUTXML:True to have it generate the XML document. Once you have it, you can manipulate the data as you see fit. I don't know if I could have it generate CSV data, simply because there is a hierarchy to the data generated.
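
Once the XML document is in hand, flattening it for pivot tables is straightforward. The sketch below is only an illustration of that idea: the element and attribute names (`PalReport`, `Analysis`, `Stat`) are assumptions for the example, not PAL's actual XML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical PAL-style XML; the element and attribute names here are
# assumptions for illustration, not the real PAL output schema.
sample = """
<PalReport>
  <Analysis counter="\\Processor(_Total)\\% Processor Time">
    <Stat name="Min" value="2.1"/>
    <Stat name="Avg" value="38.5"/>
    <Stat name="Max" value="97.0"/>
  </Analysis>
</PalReport>
"""

def stats_by_counter(xml_text):
    """Flatten the per-counter statistics into a dict for further processing."""
    root = ET.fromstring(xml_text)
    result = {}
    for analysis in root.findall("Analysis"):
        counter = analysis.get("counter")
        result[counter] = {s.get("name"): float(s.get("value"))
                           for s in analysis.findall("Stat")}
    return result

print(stats_by_counter(sample))
```

From a dict like this, exporting rows per server into a spreadsheet for pivoting is a few more lines.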
Coordinator
Feb 27, 2008 at 11:15 PM
Oh, regarding PAL being multiple-server aware, I've been working on it. To make it work correctly, I have to get the counter list from the perfmon log, then allow the user to answer the questions for each of the servers in the log. I'm trying to do this using multiple threads, and for now .NET thread management is proving difficult for me.
Apr 15, 2008 at 1:11 PM
Will you be adding an ODBC Counter Log Input as an option?

At the moment, we log our counters to a SQL database so we must use relog to convert to binary files for analysis. Tedious when we have many servers.

Great tool.

Ewan
Coordinator
Apr 23, 2008 at 10:58 PM
I don't have any plans to do that, since I have a lot of other features and bug fixes I need to get into the tool first. With that said, anyone is welcome to modify the PAL.vbs VBScript, which does all of the processing. Just let me know which portions you updated.


ewancourtney wrote:
Will you be adding an ODBC Counter Log Input as an option?

At the moment, we log our counters to a SQL database so we must use relog to convert to binary files for analysis. Tedious when we have many servers.

Great tool.

Ewan


Nov 7, 2008 at 12:35 PM
Clint,

Great tool!!

Would it be possible to add other analyses beyond perfmon objects? Some performance testing tools generate CSV files too (such as JMeter), and it could be helpful to analyze them with the same tool that analyzes the server counters.
What I mean is adding a new analysis such as "Time Response" under the "Custom" category; the Counter field could be the label generated in the CSV by the performance testing tool, and the thresholds could be the requirements that we have.

I don't know if this is possible, but it would expand the tool's functionality.

Please let me know if I missed something or I'm not being clear.

Regards,

Jose
Coordinator
Nov 7, 2008 at 6:07 PM
That's a great idea, but I'm not quite sure how I would implement that kind of functionality other than for specific cases. The problem is dealing with all of the different schemas. PAL is based on the perfmon log schema in which the first column is the Date/Time and all other columns are the counter names. All we would need to do is create tools that convert one type of schema to the perfmon log schema. I support BizTalk which would make converting these log files *very* easy. Tell you what, let's first see if any contributors out there want to take on the idea of writing PowerShell scripts to convert these log file schemas to perfmon log schemas. Depending on how that goes, after I finish PAL v2.0, I might just publish a BizTalk web service allowing people to upload their custom log and have it convert it to a perfmon log.
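
The schema conversion idea can be sketched quickly. The example below is only an illustration, assuming JMeter-style column names; it reshapes rows into the perfmon layout where the first column is the timestamp and every other column is a counter (the header is simplified compared to a real PDH-CSV header):

```python
import csv
import io

# Assumed JMeter-style input: one row per sample, metrics in separate columns.
jmeter_csv = """timeStamp,elapsed,label,Latency
08/14/2007 15:40:38,120,Home Page,45
08/14/2007 15:40:53,310,Home Page,80
"""

def to_perfmon_schema(text):
    """Reshape rows into perfmon-style CSV: timestamp first, counters after."""
    reader = csv.DictReader(io.StringIO(text))
    header = None
    rows = []
    for rec in reader:
        label = rec["label"]
        if header is None:
            # Simplified header; a real perfmon CSV's first column also
            # carries the time zone, and counter paths name a machine.
            header = ["(PDH-CSV 4.0)",
                      f"\\\\Custom\\{label}\\elapsed",
                      f"\\\\Custom\\{label}\\Latency"]
        rows.append([rec["timeStamp"], rec["elapsed"], rec["Latency"]])
    return [header] + rows

for row in to_perfmon_schema(jmeter_csv):
    print(",".join(row))
```

A PowerShell contributor script could follow the same shape: read the custom CSV, emit timestamp-first rows with counter-style column names.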

Also, I'm trying to integrate different types of logs such as Event Logs and IIS (W3C) logs into PAL v2.0.

Thank you.
Nov 7, 2008 at 6:44 PM
Thanks for reply!!!

OK, it can wait. I only want to add value to the tool, because I see that it has good potential.

About the schemas: the CSV generated by the performance tools that I use always has these columns:
timeStamp - elapsed - label - responseCode - responseMessage - threadName - bytes - grpThreads - URL - Latency - ...others
which is similar to the perfmon schema.

Thanks for hearing users' ideas!

Regards,

Jose



Feb 3, 2009 at 2:17 AM
Some thoughts on independence from MS Log Parser and SQL Server. Aside from any use of ODBC/DSNs and parsing the binary *.blg perfmon logs, perhaps we should look into the feasibility of using an embedded database. That way we get independence from Log Parser and can still use SQL and SQL Server-like access without requiring SQL Server.

There are a few out there and some are free, though there may be limitations. One that's like SQL Server for .NET is VistaDB Express, http://www.vistadb.net/vistadb3/vistadb-editions.aspx.

As for parsing the input files, the CSV files can be parsed to some extent using ADO and the MS text/CSV database provider and perhaps imported into the embedded database. To parse the binary files the same way, we'd have to relog the binary files to CSV first.
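
The relog step mentioned above is a one-liner; this is just an example invocation of the stock Windows relog.exe (the file names are placeholders):

```shell
rem Convert a binary perfmon log to CSV so it can be parsed with the
rem text/CSV providers or bulk-loaded into an embedded database.
relog server01.blg -f CSV -o server01.csv
```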
Coordinator
Feb 16, 2009 at 6:15 AM
Thank you, but I am able to get all of the functionality I need out of PowerShell v2.0 without the need for a database so far.
Feb 18, 2009 at 9:32 PM
In case you need a database engine, I think the most appropriate is SQLite. http://sqlite.phxsoftware.com/. System.Data.SQLite is an open source, public domain, full-featured ADO.NET 2.0 provider and embedded SQL database engine that requires no installation or configuration; all you need to do is copy System.Data.SQLite.dll into your app directory. It supports x86 and x64 processor architectures. Bulk inserting data into a SQLite database takes less time than bulk inserting data into a SQL Server database, even when the latter is done as a minimally logged operation. It is ACID-compliant and supports most of the SQL-92 standard.
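
To make the embedded-database idea concrete, here is a minimal sketch using Python's built-in sqlite3 module instead of System.Data.SQLite (the table layout is an assumption for illustration, not PAL's schema); the same pattern carries over to the .NET provider:

```python
import sqlite3

# Embedded database: no install, no server, no configuration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE samples (
                    ts      TEXT,
                    counter TEXT,
                    value   REAL)""")

# Bulk insert inside one transaction, which is where SQLite is fast.
rows = [("2009-02-18 21:00:00", r"\Processor(_Total)\% Processor Time", 38.5),
        ("2009-02-18 21:00:15", r"\Processor(_Total)\% Processor Time", 42.0)]
with conn:
    conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)

avg = conn.execute("SELECT AVG(value) FROM samples").fetchone()[0]
print(avg)
```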

Regards:

Jesús López
Mar 13, 2009 at 1:32 AM
feature request for 2.0:

Export or save analysis reports in PDF, RTF, or CHM (compiled HTML Help) format. I don't quite like the HTML report, since you need to copy/move/rename the subfolder of graphs along with the HTML file. It's no big deal, as the number of files and file sizes are small, but having an all-in-one document for viewing is nice. For PAL v1.x I have to resort to third-party tools to convert the HTML output to PDF, etc.

Suggestion on perfmon administration: use command-line utilities (typeperf.exe, logman.exe) or WMI via .NET to collect and manage logs within PAL, in case this wasn't considered.
Coordinator
Mar 17, 2009 at 4:53 AM
HTML is the most compatible and the easiest for me to work with. I tried doing MHT (single HTML files) for a while, but the table of contents kept breaking.
I'll definitely look into doing PDF and XPS report formats. Thanks for the suggestions.

Yes, I've been wanting to have PAL do data collection as well. Collecting counters remotely is too much overhead, so WMI and .NET collections are no good. Logman is the only viable way I know of right now that allows me to create perfmon logs local to all of the remote servers. PAL ships with several scripts that I wrote that help with perfmon log administration. They are in the Scripts directory of the PAL installation directory. I use those scripts a *lot*. ;-).
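
For reference, creating a counter log local to a remote server with logman looks roughly like this (the server name, counter, and output path are examples only, not what PAL's scripts actually use):

```shell
rem Create a counter log on a remote server, then start it.
logman create counter PAL_Sample -s server01 -c "\Processor(_Total)\% Processor Time" -si 15 -o "C:\PerfLogs\PAL_Sample"
logman start PAL_Sample -s server01
```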
Aug 18, 2009 at 6:18 PM

I would love to see it ingest any perflog file and analyze all of the counters in it.  Right now a custom template needs to be made for counters not in the default templates.

Aug 19, 2009 at 2:27 AM
d8adork wrote:

I would love to see it ingest any perflog file and analyze all of the counters in it.  Right now a custom template needs to be made for counters not in the default templates.

Good suggestion, which I also would like. But this brings up another question. If it could ingest any perflog with whatever counters it has, how would it perform the threshold and trend analysis? Those are defined by the template files. Getting the min, max, and average counter values and making graphs is easy, but trend and threshold calculations are not easy to do without some way to specify them. So unless you don't care for those, a template or something similar would still be needed.
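
The distinction mangar00 draws can be shown in a few lines: the min/max/average come from the data itself, but the threshold has to come from a template-like source. This sketch uses an illustrative counter name and an assumed 80% limit, not real PAL threshold values:

```python
import statistics

# Template-like mapping: thresholds cannot be derived from the log alone.
thresholds = {r"\Processor(_Total)\% Processor Time": 80.0}

# Sample values as they might be read from a perfmon log.
samples = {r"\Processor(_Total)\% Processor Time": [12.0, 55.0, 91.0, 96.0]}

for counter, values in samples.items():
    # The easy part: descriptive statistics fall straight out of the data.
    stats = {"min": min(values), "max": max(values),
             "avg": statistics.mean(values)}
    # The hard part: without the template entry there is nothing to compare to.
    limit = thresholds.get(counter)
    over = [v for v in values if limit is not None and v > limit]
    print(counter, stats, f"{len(over)} sample(s) over {limit}")
```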

Coordinator
Aug 24, 2009 at 5:29 AM
d8adork wrote:

I would love to see it ingest any perflog file and analyze all of the counters in it.  Right now a custom template needs to be made for counters not in the default templates.

mangar00 is correct. Some kind of threshold must be defined. Unfortunately, we don't have thresholds for all counters and likely never will - there are just too many. If you just want all of the counters displayed in graphs, then try Bling.ps1, a proof-of-concept tool I wrote for PAL v2.0. Bling.ps1 reads a perf counter log and simply creates charts of all of the counters. You can get it at:
http://pal.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=21260

Oct 8, 2009 at 6:35 PM

Version 1.3.5 has a nice feature to export the template file to HTML. This is very useful for creating a Performance Monitor log on XP or 2003. But Performance Monitor on 2008 takes an XML file as a template. It would be nice if the next version could export to the 2008 format.

Coordinator
Oct 8, 2009 at 9:13 PM

Correct, and I agree. Windows Server 2008 uses XML for data collector sets. I honestly haven't figured out the schema of the data collector set XML yet, so I can't create them on my own. If anyone out there knows how to create them or understands the schema, we need your help.

Oct 8, 2009 at 10:29 PM

I recall seeing another thread about perflogs for Vista/Windows Server 2008. It mentions a way to set up a perflog for the new OSes: configure the perflog or import the HTML template to create the perflog on an XP/2003 machine, and then, from a Vista/2008 machine, use logman.exe to export the XP/2003 perflog settings remotely into an XML file.

Perhaps someone could do a number of these export tests and then compare the original HTML templates with the XML templates produced from the exports to figure out the schema mapping.

I'd try that when I have some time. I'd also like to know the XML schema.

Oct 12, 2009 at 3:33 PM

Maybe I already said this and just can't find the post now that I'm struggling with it again...

I would like to see the report include the start and end time of the analyzed log, the interval time and the command line.

 

Not only would this help me, it will help those I might share the reports with.

 

Thanks for this EXCELLENT tool.

Coordinator
Oct 12, 2009 at 5:18 PM

Great suggestion. I will add it to PAL v2.0 this week and give you credit in the build notes. Thank you.

Coordinator
Oct 15, 2009 at 4:05 AM

Okay, I added the log time range to the report, but getting the original command line passed into the script might be hard. I'll research that. Here is what it looks like so far.

 

Tool Parameters:

Log(s): SamplePerfmonLog.blg
AnalysisInterval: 20 second(s)
Threshold File: QuickSystemOverview.xml
AllCounterStats: False
Log Time Range: 08/14/2007 15:40:38 - 08/14/2007 15:50:38
TotalMemory: 1
NumberOfProcessors: 1
Oct 15, 2009 at 3:38 PM

That looks perfect.

As for the command line suggestion: I end up re-running about every other report with slightly different parameters. I could just get into the habit of copying the command line before generating the report. Which reminds me, the persistent command window is almost useless, as the default buffer is always too short to see much and the command line is not stored in the command history.

 

Oct 15, 2009 at 6:35 PM

treestryder is right about the persistent command window, but I think it is still useful for gauging how far the processing has gone, based on things like the current counter(s) being analyzed, etc.

Coordinator
Nov 2, 2009 at 7:52 AM

FYI. I have released PAL v2.0 as an alpha-version technology preview. Consider using it only if you are already familiar with PowerShell. Please let me know your thoughts. You have been a great community to work with! :-)

Jan 15, 2010 at 3:22 PM

Could you create a template for Terminal Servers? The System Overview is very complete, but it's missing counters if you have a TS. Good utility!

Coordinator
Jan 15, 2010 at 11:27 PM

A TS threshold file is in high demand, but I need a content owner to do the proper research on the thresholds and to own the file. So far, no one has stepped up.

Jan 16, 2010 at 1:17 AM

Well, until someone steps up to take ownership of the threshold file, perhaps someone could create a sample TS thresholds template in the interim, not meant for production use, which others could customize as needed.

I'm not familiar with managing a TS or its performance aspects, so what is needed in the interim are ideas about what is of interest to monitor and report in a TS environment. Once you have that, building a thresholds file is trivial, and the thresholds can always be customized as needed.