SOLUTION DEVELOPMENT PROCESS
Given a scenario, select an effective data integration approach based on application requirements.
Identify the characteristics of the logical data model.
Identify the characteristics of the physical data model (see the sketch at the end of this section).
Given a scenario, identify the tools that should be used to support performance analysis during the development and test phases.
Given a scenario, verify that a proposed solution addresses the requirements.
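To make the logical-versus-physical distinction concrete, here is a minimal sketch (all database, table, and column names are hypothetical): the entity and attributes come from the logical model, while MULTISET, the primary index, and the RANGE_N partitioning expression are physical design choices made for Teradata.

    CREATE MULTISET TABLE sales_db.daily_sales          -- hypothetical physical table
    ( sale_id    INTEGER        NOT NULL,
      store_id   INTEGER        NOT NULL,
      sale_date  DATE           NOT NULL,
      sale_amt   DECIMAL(18,2)
    )
    PRIMARY INDEX (store_id)                            -- row distribution and access choice
    PARTITION BY RANGE_N (sale_date BETWEEN DATE '2024-01-01'
                                    AND     DATE '2026-12-31'
                                    EACH INTERVAL '1' MONTH);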
SOLUTION DEVELOPMENT CONSIDERATIONS
Contrast the update processes for event-driven, active data warehouse, and BI environments, and identify their impact on development choices.
Given a scenario, identify the impact of row and set manipulation requests on application performance and on developer choices.
Relate physical design solution development choices to data integration performance.
Relate application solution development choices to data integration performance.
Describe the role of table (horizontal) partitioning with respect to data integration.
Describe the role of column (vertical) partitioning with respect to data integration.
Given a scenario, identify the behavior of an identity column (see the sketch at the end of this section).
Identify the tools and features available to implement and manage application security and privacy.
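For the identity-column objective, a minimal sketch with hypothetical object names: with GENERATED ALWAYS, the system supplies the value, and the generated numbers are unique but are not guaranteed to be consecutive or gap-free, because number ranges are pre-allocated across AMPs and sessions.

    CREATE MULTISET TABLE sales_db.customer              -- hypothetical table
    ( customer_id INTEGER GENERATED ALWAYS AS IDENTITY
                          (START WITH 1 INCREMENT BY 1 NO CYCLE),
      customer_nm VARCHAR(100)
    )
    UNIQUE PRIMARY INDEX (customer_id);

    -- The identity value is system-generated; the INSERT does not supply it.
    INSERT INTO sales_db.customer (customer_nm) VALUES ('Smith');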
TRANSACTION PROCESSING AND LOCK MANAGEMENT
Identify the behavior of lock acquisition and release.
Identify the potential for deadlocking to occur.
Given a scenario, identify the effects of a system restart on utility locks.
Given a scenario, identify the effects of transaction isolation on locking.
Given a scenario, identify the lock considerations within an explicit transaction when using a locking modifier (see the sketch at the end of this section).
Given a scenario, identify the lock considerations to handle retryable failures within an explicit transaction.
Given a scenario, identify the benefits of using a multi-statement request vs. a multi-request transaction.
Given a scenario, identify when ANSI mode needs to be used instead of Teradata mode.
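For the locking-modifier objective, a minimal Teradata-mode sketch (hypothetical table): locks acquired inside an explicit BT/ET transaction are held until the transaction ends, and the LOCKING ROW FOR ACCESS modifier lets the read avoid blocking on concurrent writers at the cost of a possible dirty read.

    BT;                                   -- begin explicit transaction (Teradata mode)

    LOCKING ROW FOR ACCESS                -- downgrade the read to an ACCESS lock
    SELECT order_status
    FROM   sales_db.orders                -- hypothetical table
    WHERE  order_id = 1001;

    UPDATE sales_db.orders                -- the WRITE lock acquired here is held until ET
    SET    order_status = 'SHIPPED'
    WHERE  order_id = 1001;

    ET;                                   -- commit; all locks from the transaction are released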
SOLUTION DEVELOPMENT PLANNING PROCESS
When planning for implementation of a new solution, identify the information that should be provided to a capacity planning process.
When planning for implementation of a new solution, identify the information that should be provided to a security planning process.
When planning for implementation of a new solution, identify the information that should be provided to a database object planning process.
Describe a process to promote from development, to test environment(s), and into production.
Identify tools and processes available to analyze solution performance metrics that can also be used during testing (see the sketch at the end of this section).
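As one example of the last objective, a hedged sketch of mining the Database Query Log during testing; it assumes DBQL logging is enabled for the test account, that the reader can select from DBC.QryLogV, and that the column names below are available in the local release. The user name is hypothetical.

    SELECT  UserName,
            QueryID,
            AMPCPUTime,
            TotalIOCount,
            QueryText
    FROM    DBC.QryLogV
    WHERE   CAST(StartTime AS DATE) = CURRENT_DATE
    AND     UserName = 'TEST_APP_USER'     -- hypothetical test account
    ORDER BY AMPCPUTime DESC;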
ADVANCED SQL CONCEPTS
Given a CREATE TRIGGER statement, describe which events will take place upon firing the trigger, and in which sequence they will occur.
Given a scenario, identify which kind of temporary table should be used.
Given a SQL request which involves date and time manipulation, identify the result.
Given a scenario with a type of outer join, identify the result of the join.
Given a scenario, identify the impact of NULL processing on the result set.
Given a scenario, identify the appropriate coding construct (stored procedure, macro, UDF, etc.).
Given a recursive query, identify the potential performance issues.
Given a scenario, identify when the use of Stored Procedures may be recommended.
Describe the types and uses of Stored Procedure parameters and variables.
Given a scenario, select the correct Ordered Analytic functions (see the sketch at the end of this section).
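For the ordered analytic objective, a minimal sketch against the hypothetical table used earlier: a cumulative window SUM combined with a QUALIFY filter that keeps only the three largest sales per store.

    SELECT   store_id,
             sale_date,
             sale_amt,
             SUM(sale_amt) OVER (PARTITION BY store_id
                                 ORDER BY sale_date
                                 ROWS UNBOUNDED PRECEDING) AS running_amt
    FROM     sales_db.daily_sales                 -- hypothetical table
    QUALIFY  RANK() OVER (PARTITION BY store_id
                          ORDER BY sale_amt DESC) <= 3;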
DATA INTEGRATION STRATEGIES
Given a scenario, identify which load operator should be used.
Given a scenario, identify which load techniques should be used, including availability and data integrity considerations.
Given a scenario, identify the appropriate export option.
Given a scenario, identify the locking impact caused by the different load utilities.
Given a set of requirements, identify the data integrity features that should be used.
Given a scenario with triggers, RI constraints, or check constraints, identify the logical order of specified events that will take place.
Determine the similarities and differences among Standard RI, Batch RI, and Soft RI.
Identify considerations when loading BLOBs and CLOBs.
Identify components for a dual system architecture.
Describe the effect of various FastLoad/MultiLoad parameters.
Describe the effect of various BTEQ parameters.
Describe the effect of various Teradata Parallel Transporter parameters.
Describe the effect of various TPump performance parameters.
Identify the different methods used to handle duplicate rows.
Identify how to eliminate unwanted duplicate rows from a multi-set table (see the sketch at the end of this section).
Identify load considerations for set and multi-set tables.
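For the duplicate-row objectives, a minimal sketch with hypothetical staging tables showing two common approaches: an INSERT...SELECT into an empty SET table silently discards exact duplicate rows, while ROW_NUMBER with QUALIFY keeps one row per duplicate group in a new MULTISET table.

    -- Approach 1: SET table semantics discard exact duplicates on INSERT...SELECT
    CREATE SET TABLE sales_db.stg_sales_dedup AS sales_db.stg_sales WITH NO DATA;
    INSERT INTO sales_db.stg_sales_dedup
    SELECT * FROM sales_db.stg_sales;

    -- Approach 2: keep one row per duplicate group with an ordered analytic
    CREATE MULTISET TABLE sales_db.stg_sales_clean AS
    ( SELECT *
      FROM   sales_db.stg_sales
      QUALIFY ROW_NUMBER()
              OVER (PARTITION BY sale_id, store_id, sale_date, sale_amt
                    ORDER BY sale_id) = 1
    ) WITH DATA;

Which approach fits depends on whether "duplicate" means a fully identical row or only a repeated business key.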
ACCESS LAYER AND DATA DELIVERY STRATEGIES
Given a scenario, select an effective data access approach based on differing application requirements.
Identify the characteristics of the semantic physical data model.
Relate BI solution development choices to data access performance.
Relate data integration solution development choices to data access performance.
Describe the role of table (horizontal) partitioning with respect to access (see the sketch at the end of this section).
Describe the role of column (vertical) partitioning with respect to access.
Key topics: the impact of application requirements on the data access strategy, the relationship between data integration options and data access performance, and partitioning.
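For the access-layer objectives, a minimal sketch with hypothetical names: a semantic-layer view whose LOCKING ROW FOR ACCESS modifier keeps BI reads from blocking on concurrent loads, queried with a date predicate that allows partition elimination against the RANGE_N-partitioned base table from the earlier sketch.

    REPLACE VIEW semantic_db.v_daily_sales AS      -- hypothetical semantic-layer view
    LOCKING ROW FOR ACCESS
    SELECT  store_id,
            sale_date,
            sale_amt
    FROM    sales_db.daily_sales;

    -- The date predicate limits the scan to the qualifying partitions.
    SELECT   store_id, SUM(sale_amt)
    FROM     semantic_db.v_daily_sales
    WHERE    sale_date BETWEEN DATE '2025-01-01' AND DATE '2025-01-31'
    GROUP BY store_id;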
SOLUTION OPTIMIZATION
Given a scenario including table definitions, a query, and an EXPLAIN, identify opportunities for optimization.
Describe the operations occurring when join terminology is seen in an EXPLAIN.
Describe the operations occurring when key phrases are seen in an EXPLAIN.
Given a scenario with table definitions and SQL, identify the opportunities to collect statistics that may improve performance (see the sketch at the end of this section).
Given a scenario with table definitions and SQL, identify the opportunities to create secondary indexes that may improve performance.
Given a scenario with table definitions and SQL, identify the opportunities to create join indexes that may improve performance.
Given a scenario with table definitions, SQL, and data demographics, identify the best performing partitioning definition.
Identify possible performance considerations of TPT Update and Stream Operators.
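For the statistics and EXPLAIN objectives, a minimal sketch against the hypothetical table from the earlier examples: collect single- and multi-column statistics, then re-examine the plan; after collection, the EXPLAIN row estimates should be based on collected statistics rather than random-AMP samples.

    COLLECT STATISTICS ON sales_db.daily_sales COLUMN (store_id);
    COLLECT STATISTICS ON sales_db.daily_sales COLUMN (sale_date);
    COLLECT STATISTICS ON sales_db.daily_sales COLUMN (store_id, sale_date);

    EXPLAIN
    SELECT   store_id,
             SUM(sale_amt)
    FROM     sales_db.daily_sales
    WHERE    sale_date BETWEEN DATE '2025-01-01' AND DATE '2025-01-31'
    GROUP BY store_id;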