Monday, September 14, 2015

Are You Heading to Las Vegas for IBM Insight?

The IBM Insight conference is coming up, and if you work with DB2, Big Data, analytics, data warehousing, or really anything at all related to enterprise data, then Las Vegas is the place to be the week of October 25 through 29, 2015.

But what is IBM Insight? Well, you may remember it as the IBM Information on Demand conference, or IOD for short. Yes, IBM has renamed the conference yet again. I'm sure a lot of you can remember when there were a bunch of different "Technical Conferences" like the IMS Tech Conference or the DB2 Tech Conference. Those conferences, as well as several others, all got rolled up into IOD... which is now IBM Insight.


So with that bit of potential confusion out of the way, why should you attend this year's IBM Insight conference? Simply put, there is something of interest for everybody in a data-related profession. 

The conference boasts more than 1600 presentations that run the gamut from DB2 to IMS to Cognos to BI to Big Data to analytics to... well, you get the idea. You can experience technical training, hands-on labs, and industry use cases delivered by over 1000 industry experts.

There will be multiple keynote sessions focusing on IBM's important data initiatives, including:
  • Data Management
  • Advanced Analytics
  • Cloud
  • Hadoop & Spark
  • Content Management
  • Watson
Last year's event was attended by over 13,000 folks, which gives you a great opportunity to network with your peers and with IBMers. Add to that the more than 350 exhibitors in the Expo hall and you will be able to view, review, and examine all kinds of interesting software to help you manage your enterprise data.

I also want to promote my presentation at this year's conference, called Not Your Daddy's DB2! I'll talk about the changing landscape of the industry and how DB2 for z/OS has changed (and continues to change) to embrace modern IT. This session will be held on Wednesday, October 28, at 4:00 pm in the South Seas J room. If you're going to the conference, I hope to see you at my presentation.

And, as always, there will be plenty of time to kick back and relax after a long day of networking and learning. On Wednesday evening conference attendees will be treated to a free concert from Maroon 5... and, of course, everything that Las Vegas has to offer can be on your agenda, too.

So what are you waiting for? Register for the 2015 IBM Insight conference today... 

And if you register by September 18th, you can get a discounted rate ($300 off).

Tuesday, September 08, 2015

Mainframe Cost Optimization



Modern businesses live in an age of financial austerity, cost containment, and cutbacks. It is just a fact of life in many organizations that you need to be constantly vigilant for new ways to reduce costs. If your business relies on the mainframe -- and many of the biggest businesses do -- cost containment is of the utmost importance.

But how do you cut costs? Many mainframe support groups are already running thin in terms of people, so layoffs don't make a lot of sense. And the software that runs the business can't be cut. Management software supports those business systems, so cutting those tools may cost more than you save!

But there are things you can do. If you're interested in learning more about IBM MLC software costs, pricing and licensing, mainframe cost optimization, and a solution for dynamically managing your system to reduce software costs, be sure to take an hour out of your busy schedule and join me for a free webinar titled Mainframe Cost Optimization: Pricing, Licensing, the R4HA, and More!

I'll be delivering this webinar on September 10, 2015 at 2pm EDT.

During this session I'll discuss:
  • The new mainframe pricing options, including zCAP, CMP, and MWP
  • The disparate moving parts of sub-capacity pricing, including the R4HA
  • Methods for intelligently controlling the R4HA to reduce monthly software costs

So click here to register for the webinar and join me on September 10th. 

Friday, September 04, 2015

Influencing the DB2 Optimizer: Part 7 - Miscellaneous Additional Considerations

In this seventh and final installment of this series on influencing the DB2 optimizer's access path choices, we will take a look at a few additional things to consider as you work toward improving your SQL performance.

Favor Optimization Hints Over Updating the DB2 Catalog  

Optimization hints to influence access paths are less intrusive and easier to implement than changing data in the DB2 Catalog. However, that does not mean that you should use optimization hints all the time! Do not use optimization hints as a crutch to arrive at a specific access path. Optimization hints are best used when an access path changes and you want to go back to a previous, efficient access path.
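If you do decide a hint is warranted, the general approach on DB2 for z/OS is to seed your PLAN_TABLE with the rows describing the desired access path, tag them with a hint name in the OPTHINT column, and then rebind (or, for dynamic SQL, set the special register). Here is a minimal sketch; the query number, hint name, and package are hypothetical, and your subsystem must have optimization hints enabled:

-- Tag the PLAN_TABLE rows for the desired access path with a hint name
UPDATE PLAN_TABLE
   SET OPTHINT = 'GOODPATH'
 WHERE QUERYNO = 100;

-- Then tell DB2 to apply the hint at rebind time, for example:
--   REBIND PACKAGE(MYCOLL.MYPKG) OPTHINT('GOODPATH')

-- For dynamic SQL, point the special register at the hint
SET CURRENT OPTIMIZATION HINT = 'GOODPATH';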

Limit Ordering to Avoid Scanning  

The optimizer is more likely to choose an index scan when ordering is important (ORDER BY, GROUP BY, or DISTINCT) and the index is clustered by the columns to be sorted.
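For example, assuming a hypothetical clustering index on the sort column of the sample EMP table, a query like the following is more likely to use the index and avoid a sort:

CREATE INDEX DSN81010.XEMPLN
    ON DSN81010.EMP (LASTNAME)
    CLUSTER;

SELECT   EMPNO, LASTNAME
FROM     DSN81010.EMP
ORDER BY LASTNAME;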

Maximize Buffers and Minimize Data Access  

If the inner table fits in 2% of the buffer pool, nested loop join should be favored. Therefore, to increase the chances of nested loop joins, increase the size of the buffer pool (or decrease the size of the inner table, if possible).
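As a simple sketch, you could enlarge the buffer pool used by the inner table's table space with the ALTER BUFFERPOOL command (the pool name and size here are hypothetical; VPSIZE is specified in buffers, not bytes):

-ALTER BUFFERPOOL(BP2) VPSIZE(50000)
-DISPLAY BUFFERPOOL(BP2) DETAIL

Displaying the pool afterward lets you confirm the new size, and you would also need to ensure the inner table's table space is actually assigned to that pool (ALTER TABLESPACE ... BUFFERPOOL BP2).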

Consider Deleting Non-uniform Distribution Statistics

Sometimes non-uniform distribution statistics can cause dynamic SQL statements to fluctuate dramatically in terms of how they perform. To decrease these wild fluctuations, consider removing the non-uniform distribution statistics from the DB2 Catalog.

Although dynamic SQL makes the best use of these statistics, the overall performance of some applications that heavily use dynamic SQL can suffer. The optimizer might choose a different access path for the same dynamic SQL statement, depending on the values supplied to the predicates. In theory, this should be the desired goal. In practice, however, the results might be unexpected. For example, consider the following dynamic SQL statement:

SELECT   EMPNO, LASTNAME
FROM     DSN81010.EMP
WHERE    WORKDEPT = ?

The access path might change depending on the value of WORKDEPT because the optimizer calculates different filter factors for each value, based on the distribution statistics. As the number of occurrences of distribution statistics increases, the filter factor decreases. This makes DB2 think that fewer rows will be returned, which increases the chance that an index will be used and affects the choice of inner and outer tables for joins.

These statistics are stored in the SYSIBM.SYSCOLDIST and SYSIBM.SYSCOLDISTSTATS tables and can be removed using SQL DELETE statements.
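A hedged sketch of what that might look like follows; review the statistics before deleting anything, and verify the catalog column names against your DB2 version (the table and column values shown are illustrative):

-- Examine the distribution statistics for the column first
SELECT   NAME, COLVALUE, FREQUENCYF
FROM     SYSIBM.SYSCOLDIST
WHERE    TBOWNER = 'DSN81010'
AND      TBNAME  = 'EMP'
AND      NAME    = 'WORKDEPT';

-- Only if performance warrants it, remove them
DELETE FROM SYSIBM.SYSCOLDIST
WHERE    TBOWNER = 'DSN81010'
AND      TBNAME  = 'EMP'
AND      NAME    = 'WORKDEPT';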

This suggested guideline does not mean that you should always delete the non-uniform distribution statistics. My advice is quite to the contrary. When using dynamic SQL, allow DB2 the chance to use these statistics. Delete these statistics only when performance is unacceptable. (They can always be repopulated later using RUNSTATS.)

Collect More Than Just the Top Ten Non-uniform Distribution Statistics   

If non-uniform distribution impacts more than just the top ten most frequently occurring values, you should use the FREQVAL option of RUNSTATS to capture more than 10 values. Capture only as many as will prove to be useful for optimizing queries against the non-uniformly distributed data.
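A sketch of such a RUNSTATS invocation follows; the database, table space, and COUNT value are hypothetical and should be tailored to your own objects and data skew:

RUNSTATS TABLESPACE DSN8D10A.DSN8S10E
         TABLE(DSN81010.EMP)
         COLGROUP(WORKDEPT) FREQVAL COUNT 25
         SHRLEVEL REFERENCE

Here the 25 most frequently occurring WORKDEPT values would be captured instead of just the top 10.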

DB2 Referential Integrity Use

Referential integrity (RI) is the implementation of constraints between tables so that values from one table (the parent) control the values in another (the dependent, or child). A referential constraint between a parent table and a dependent table is defined by a relationship between the columns of the tables. The parent table’s primary key columns control the values permissible in the dependent table’s foreign key columns. For example, in the sample table, DSN81010.EMP, the WORKDEPT column (the foreign key) must reference a valid department as defined in the DSN81010.DEPT table’s DEPTNO column (the primary key).

You have two options for implementing RI at your disposal: declarative and application. Declarative constraints provide DB2-enforced referential integrity and are specified by DDL options. All modifications, whether embedded in an application program or ad hoc, must comply with the referential constraints. Favor using declarative RI as DB2 will then be aware of the relationship and can use that information during access path optimization.
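For example, the EMP-to-DEPT relationship described earlier could be declared with DDL along these lines (the constraint name and delete rule are illustrative):

ALTER TABLE DSN81010.EMP
  ADD CONSTRAINT EMPDEPT
      FOREIGN KEY (WORKDEPT)
      REFERENCES DSN81010.DEPT (DEPTNO)
      ON DELETE SET NULL;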

Application-enforced referential integrity is coded into application programs. Every program that can update referentially-constrained tables must contain logic to enforce the referential integrity. This type of RI is not applicable to ad hoc updates.

With DB2-enforced RI, CPU use is reduced because the Data Manager component of DB2 performs DB2-enforced RI checking, whereas the RDS component of DB2 performs application-enforced RI checking. Additionally, rows accessed for RI checking when using application-enforced RI must be passed back to the application from DB2. DB2-enforced RI does not require this passing of data, further reducing CPU time.

In addition, DB2-enforced RI uses an index (if one is available) when enforcing the referential constraint. In application-enforced RI, index use is based on the SQL used by each program to enforce the constraint.

If you must use application-enforced RI instead of declarative RI, be sure to also define the referential constraints with the NOT ENFORCED keyword. In that case, the constraints will not be enforced by DB2, but they will be documented in the DDL. This also gives DB2 additional information that the optimizer can use during query optimization.
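A sketch of such an informational constraint follows; it documents the relationship for the optimizer without incurring DB2-enforced checking (again, the constraint name is illustrative):

ALTER TABLE DSN81010.EMP
  ADD CONSTRAINT EMPDEPT
      FOREIGN KEY (WORKDEPT)
      REFERENCES DSN81010.DEPT (DEPTNO)
      NOT ENFORCED;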

Summary

Hopefully this 7-part series on influencing DB2 access paths provided you with a nice overview of the options available to you and considerations for their use. If you are interested in learning more about SQL tuning and DB2 performance, consider purchasing the book from which this series was drawn: DB2 Developer's Guide 6th edition.

Happy SQL performance tuning!