1Z0-1110
Coming Soon
Expected availability will be announced soon

This course is in active development. Preview the scope below and create a free account to be notified the moment it goes live.


OCI Data Science Professional

The Oracle Cloud Infrastructure Data Science Professional certification validates your ability to build, train, evaluate, and deploy machine‑learning models on OCI, covering data pipelines, integration, and MLOps best practices.

90 minutes · 50 questions · 65% passing score · $245 exam cost

Who Should Take This

Data engineers, data scientists, and DevOps professionals who have at least two years of experience with cloud platforms and Python‑based machine‑learning workflows should pursue this certification. It validates their ability to design end‑to‑end ML pipelines on OCI, optimize model performance, and implement robust MLOps processes, positioning them for senior roles in cloud‑native AI projects.

What's Covered

1 ML Fundamentals on OCI
2 Model Training and Evaluation
3 Model Deployment and Serving
4 Data Processing and Integration
5 MLOps and Model Management

What's Included in AccelaStudy® AI

Adaptive Knowledge Graph
Practice Questions
Lesson Modules
Console Simulator Labs
Exam Tips & Strategy
20 Activity Formats

Course Outline

62 learning goals
1 ML Fundamentals on OCI
2 topics

ML Workflow Design

  • Design end-to-end ML workflows on OCI covering data ingestion, feature engineering, model training, evaluation, and deployment.
  • Implement OCI Data Science notebook sessions with compute shapes, conda environments, and custom container runtimes for development.
  • Architect data pipelines using OCI Data Flow (Spark), Data Integration, and Object Storage for ML feature preparation at scale.
  • Evaluate ML problem formulations to select appropriate algorithms, evaluation metrics, and data splitting strategies for OCI.
  • Configure ML workflows with appropriate settings, policies, and parameters for a production deployment scenario.
  • Assess ML workflow implementations against best practices to identify gaps and recommend improvements for production readiness.
  • Architect ML workflow solutions with scalability patterns, capacity planning, and growth accommodation for long-term sustainability.
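The data-splitting strategies mentioned above can be sketched in plain Python. This is an illustrative sketch only (function name and split fractions are assumptions); in an OCI notebook session you would more likely reach for scikit-learn's splitting utilities:

```python
import random

def train_val_test_split(rows, val_frac=0.15, test_frac=0.15, seed=42):
    """Shuffle deterministically, then carve out validation and test sets."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)  # fixed seed keeps the split reproducible
    n = len(rows)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    test = rows[:n_test]
    val = rows[n_test:n_test + n_val]
    train = rows[n_test + n_val:]
    return train, val, test

train, val, test = train_val_test_split(range(100))
```

Holding the seed constant is what makes later model comparisons fair: every candidate model sees the same train/validation/test partition.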

Data Preparation

  • Implement data exploration and profiling using OCI Data Science notebooks with pandas, matplotlib, and the ADS SDK.
  • Design feature engineering pipelines with data transformation, encoding, scaling, and feature selection for model training.
  • Architect data labeling workflows using OCI Data Labeling for creating annotated datasets for classification and detection tasks.
  • Analyze data quality issues including missing values, class imbalance, outliers, and data drift for model reliability.
  • Evaluate data-preparation alternatives and tradeoffs to recommend the optimal approach for given performance and cost constraints.
  • Formulate data-preparation governance frameworks with policies, standards, and compliance monitoring for organizational alignment.
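Two of the transformations above, scaling and categorical encoding, can be shown in a few lines of plain Python. A minimal sketch (in practice you would use pandas or scikit-learn transformers inside a Data Science notebook):

```python
def min_max_scale(values):
    """Rescale numeric values into [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # guard against a constant column
    return [(v - lo) / span for v in values]

def one_hot(categories):
    """Encode categories as 0/1 indicator vectors over a sorted vocabulary."""
    vocab = sorted(set(categories))
    index = {c: i for i, c in enumerate(vocab)}
    rows = [[1 if index[c] == i else 0 for i in range(len(vocab))]
            for c in categories]
    return rows, vocab

scaled = min_max_scale([10, 20, 40])
encoded, vocab = one_hot(["red", "blue", "red"])
```

Fitting the vocabulary and the min/max on training data only, then reusing them at inference time, is what prevents train/serve skew.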
2 Model Training and Evaluation
2 topics

Model Training

  • Implement model training using scikit-learn, XGBoost, LightGBM, and TensorFlow/PyTorch on OCI Data Science compute instances.
  • Design distributed training architectures using OCI Data Science jobs with GPU shapes for large-scale deep learning workloads.
  • Architect hyperparameter tuning workflows using the ADS SDK, manual search, grid search, and Bayesian optimization strategies.
  • Implement AutoML capabilities using Oracle AutoML and the ADS SDK for automated model selection and pipeline optimization.
  • Analyze model training results to compare algorithms, identify overfitting/underfitting, and select optimal model configurations.
  • Integrate model training with monitoring, logging, and alerting services for operational visibility.
  • Analyze model-training configurations to identify security vulnerabilities, performance bottlenecks, and optimization opportunities.
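The grid-search strategy listed above reduces to a loop over the cartesian product of parameter values. A stdlib-only sketch (the toy objective stands in for a cross-validated model score; parameter names are illustrative):

```python
from itertools import product

def grid_search(param_grid, evaluate):
    """Exhaustively try every parameter combination and keep the best score."""
    best_params, best_score = None, float("-inf")
    keys = sorted(param_grid)
    for combo in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, combo))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective: peaks at max_depth=4 with the smallest learning_rate.
grid = {"max_depth": [2, 4, 6], "learning_rate": [0.1, 0.3]}
best, score = grid_search(
    grid, lambda p: -abs(p["max_depth"] - 4) - p["learning_rate"]
)
```

Grid search scales multiplicatively with each added parameter, which is why the outline also covers Bayesian optimization for larger search spaces.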

Model Evaluation

  • Design model evaluation frameworks with holdout sets, cross-validation, and stratified sampling for robust performance estimation.
  • Implement evaluation metrics calculation: classification (accuracy, precision, recall, F1, AUC-ROC), regression (MSE, RMSE, MAE, R²).
  • Architect model explainability using SHAP, LIME, and permutation importance for interpretable ML predictions on OCI.
  • Evaluate model fairness, bias detection, and responsible AI compliance using the ADS MLX module and Oracle model evaluation tools.
  • Design enterprise-grade model-evaluation architectures incorporating high-availability, disaster-recovery, and security requirements.
  • Apply model-evaluation configuration patterns to meet specific business requirements, including compliance and governance needs.
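The classification metrics named above all derive from the confusion-matrix counts. A plain-Python sketch of the formulas (scikit-learn provides these as `sklearn.metrics` functions in practice):

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 from binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

m = classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

On imbalanced datasets, accuracy alone is misleading, which is why the exam pairs it with precision, recall, F1, and AUC-ROC.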
3 Model Deployment and Serving
2 topics

Model Deployment

  • Design model deployment architectures using OCI Data Science model deployment with real-time HTTP endpoints and auto-scaling.
  • Implement model serialization, artifact packaging, and conda environment specification for reproducible model deployment.
  • Architect model versioning with the Model Catalog for lifecycle management, lineage tracking, and model governance.
  • Evaluate deployment configurations to optimize inference latency, throughput, and compute cost for production serving.
  • Implement model deployments following security, performance, and reliability best practices in OCI Data Science.
  • Diagnose model-deployment issues by analyzing metrics, logs, and configuration to determine root causes and remediation steps.
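The serialization-and-packaging step above follows a hook-based pattern: an OCI Data Science model artifact includes a `score.py` whose `load_model()` and `predict()` functions the deployment runtime invokes. A minimal sketch, with a plain dict standing in for a fitted estimator (the threshold model and paths are illustrative, not Oracle's API):

```python
import os
import pickle
import tempfile

# Serialize the "trained model" into an artifact directory.
artifact_dir = tempfile.mkdtemp()
model_path = os.path.join(artifact_dir, "model.pkl")
with open(model_path, "wb") as f:
    pickle.dump({"threshold": 0.5}, f)

# score.py-style hooks the serving runtime would call.
def load_model():
    with open(model_path, "rb") as f:
        return pickle.load(f)

def predict(data, model=None):
    model = model if model is not None else load_model()
    return {"prediction": [int(x >= model["threshold"]) for x in data]}

result = predict([0.2, 0.9])
```

Keeping the conda environment specification alongside the pickled artifact is what makes the deployment reproducible: the runtime that unpickles the model must match the one that trained it.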

Batch and Edge Inference

  • Implement batch prediction workflows using OCI Data Science jobs and Data Flow for large-scale offline inference processing.
  • Design model A/B testing and shadow deployment patterns for comparing model versions in production environments.
  • Architect model monitoring for prediction drift detection, performance degradation alerts, and automated retraining triggers.
  • Analyze model serving metrics to identify latency spikes, error rates, and capacity planning requirements for ML endpoints.
  • Analyze batch and edge inference configurations to identify security vulnerabilities, performance bottlenecks, and optimization opportunities.
  • Recommend batch and edge inference optimization strategies balancing performance, cost, operational complexity, and risk management.
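One common drift-detection statistic behind the monitoring bullet above is the Population Stability Index (PSI), which compares how production scores distribute across buckets fitted on the baseline. A stdlib sketch (bucket count and the smoothing constant are illustrative choices):

```python
import math

def population_stability_index(expected, actual, bins=4):
    """PSI between a baseline and a production score distribution."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def bucket_fracs(values):
        counts = [0] * bins
        for v in values:
            counts[sum(v > e for e in edges)] += 1
        # floor each fraction to avoid log(0) on empty buckets
        return [max(c / len(values), 1e-6) for c in counts]

    e, a = bucket_fracs(expected), bucket_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
psi_same = population_stability_index(baseline, baseline)
psi_drifted = population_stability_index(baseline, [0.7, 0.8, 0.9, 0.8])
```

A PSI near zero means the serving distribution matches training; a large value is the kind of signal that would fire an alert or an automated retraining trigger.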
4 Data Processing and Integration
2 topics

OCI Data Flow

  • Implement Apache Spark applications on OCI Data Flow for large-scale data processing, transformation, and feature computation.
  • Design Data Flow pipeline architectures with job scheduling, parameterization, and Object Storage integration for ETL workflows.
  • Architect streaming data processing with Spark Structured Streaming on Data Flow for real-time feature computation and analytics.
  • Analyze Data Flow job performance to optimize Spark configurations, partitioning, and resource allocation for cost efficiency.
  • Architect OCI Data Flow solutions with scalability patterns, capacity planning, and growth accommodation for long-term sustainability.
  • Configure OCI Data Flow with appropriate settings, policies, and parameters for a production deployment scenario.
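On Data Flow the feature computation above would run as a PySpark job; the transformation it expresses, a grouped aggregation, can be sketched in plain Python to show the logic (schema and column names are illustrative):

```python
from collections import defaultdict

# Rows as they might arrive from Object Storage.
events = [
    {"user": "a", "amount": 10.0},
    {"user": "a", "amount": 30.0},
    {"user": "b", "amount": 5.0},
]

# Plain-Python equivalent of df.groupBy("user").agg(count, sum, avg) in Spark.
totals = defaultdict(lambda: {"count": 0, "sum": 0.0})
for row in events:
    agg = totals[row["user"]]
    agg["count"] += 1
    agg["sum"] += row["amount"]

features = {user: {**agg, "avg": agg["sum"] / agg["count"]}
            for user, agg in totals.items()}
```

Spark distributes exactly this shuffle-and-aggregate across executors, which is why partitioning strategy (one of the tuning levers listed above) dominates job cost at scale.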

Data Integration and Catalog

  • Implement OCI Data Integration for visual ETL pipeline design with source, transformation, and target operators.
  • Design OCI Data Catalog for metadata management, data discovery, and data lineage tracking across OCI data assets.
  • Architect data lakehouse patterns combining Object Storage, Data Flow, Data Catalog, and Autonomous DW for analytics.
  • Evaluate data pipeline architectures to identify bottlenecks, data quality issues, and optimization opportunities.
  • Explain how to troubleshoot common issues with Data Integration and Data Catalog, including error messages, logs, and diagnostic procedures.
  • Evaluate Data Integration and Data Catalog alternatives and tradeoffs to recommend the optimal approach for given performance and cost constraints.
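Data lineage, as tracked by Data Catalog, is at heart a directed graph from source assets through transforms to targets. A minimal sketch of the idea (asset and transform names are made up for illustration):

```python
# Each step records which target asset derives from which sources.
lineage = []

def register_step(sources, transform, target):
    lineage.append({"sources": list(sources),
                    "transform": transform,
                    "target": target})

def upstream_of(target):
    """Walk lineage edges backwards to find every contributing asset."""
    found = set()
    frontier = [target]
    while frontier:
        t = frontier.pop()
        for step in lineage:
            if step["target"] == t:
                for s in step["sources"]:
                    if s not in found:
                        found.add(s)
                        frontier.append(s)
    return found

register_step(["raw_orders", "raw_users"], "join+clean", "curated_orders")
register_step(["curated_orders"], "aggregate", "order_features")
sources = upstream_of("order_features")
```

Being able to answer "which raw datasets fed this feature table?" is exactly the impact-analysis question lineage tracking exists for.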
5 MLOps and Model Management
2 topics

MLOps Practices

  • Design MLOps workflows with automated training, validation, deployment, and monitoring using OCI services and DevOps pipelines.
  • Implement CI/CD for ML with automated model testing, performance validation gates, and production deployment automation.
  • Architect model registry management with lifecycle stages (development, staging, production, archived) and access control policies.
  • Evaluate MLOps maturity to identify gaps in automation, reproducibility, and governance for ML system operations.
  • Compare MLOps deployment patterns to determine the best architecture for meeting availability and scalability requirements.
  • Design enterprise-grade MLOps architectures incorporating high-availability, disaster-recovery, and security requirements.
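The "performance validation gate" in the CI/CD bullet above is simply an automated check that a candidate model beats production before promotion. A hedged sketch (metric names and the margin are illustrative):

```python
def promotion_gate(candidate_metrics, production_metrics,
                   min_gain=0.0, required=("auc",)):
    """Approve promotion only if the candidate beats production
    on every required metric by at least min_gain."""
    for metric in required:
        if candidate_metrics[metric] < production_metrics[metric] + min_gain:
            return False, f"{metric} regressed or did not improve enough"
    return True, "candidate approved for deployment"

ok, reason = promotion_gate({"auc": 0.91}, {"auc": 0.88}, min_gain=0.01)
blocked, why = promotion_gate({"auc": 0.88}, {"auc": 0.88}, min_gain=0.01)
```

In a pipeline, a failed gate stops the deployment stage; the same check can also gate automated retraining jobs so a worse model never silently replaces a better one.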

Model Governance

  • Design model governance frameworks with documentation standards, approval workflows, and compliance audit trails on OCI.
  • Implement model reproducibility with versioned datasets, training parameters, and environment specifications in Model Catalog.
  • Architect model security with IAM policies, private endpoints, and VCN isolation for securing ML infrastructure and data.
  • Analyze ML system risks including model staleness, data pipeline failures, and serving infrastructure vulnerabilities.
  • Plan model-governance migration and modernization strategies, including phased rollout, testing, and rollback procedures.
  • Implement model governance following security, performance, and reliability best practices in OCI Data Science.
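The reproducibility bullet above amounts to fingerprinting everything needed to retrain the same model: data, parameters, and environment. A stdlib sketch of such a manifest (the conda-environment slug and field names are illustrative, not a Model Catalog schema):

```python
import hashlib
import json

def reproducibility_manifest(dataset_bytes, params, environment):
    """Record content hashes and settings needed to reproduce a training run."""
    return {
        "dataset_sha256": hashlib.sha256(dataset_bytes).hexdigest(),
        "params": params,
        "environment": environment,
        "manifest_sha256": None,  # filled in below over the canonical JSON
    }

manifest = reproducibility_manifest(
    b"user,amount\na,10\n",
    {"max_depth": 4, "seed": 42},
    {"python": "3.11", "conda_env": "example_ml_env_v1"},  # illustrative slug
)
canonical = json.dumps(
    {k: v for k, v in manifest.items() if k != "manifest_sha256"},
    sort_keys=True,
).encode()
manifest["manifest_sha256"] = hashlib.sha256(canonical).hexdigest()
```

Hashing the canonical JSON (sorted keys) makes the manifest itself tamper-evident, which is what an audit trail needs.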

Scope

Included Topics

  • All domains in the Oracle Cloud Infrastructure Data Science Professional (1Z0-1110) exam guide.
  • Topics: ML Workflows, OCI Data Science, Model Training, Feature Engineering, Model Deployment, MLOps, Data Processing.
  • Oracle Cloud Infrastructure services, tools, and best practices relevant to this certification.
  • Scenario-based problem solving at the Professional level.

Not Covered

  • Topics outside the official exam guide scope.
  • Programming language specifics and code-level implementation details.
  • Specific pricing values and promotional offers that change over time.
  • Third-party products and non-Oracle services beyond basic integration awareness.

Official Exam Page

Learn more at Oracle


1Z0-1110 is coming soon

Adaptive learning that maps your knowledge and closes your gaps.

Create Free Account to Be Notified

Trademark Notice

Oracle®, Java®, MySQL®, and all Oracle certification marks are registered trademarks of Oracle Corporation. Oracle does not endorse this product.

AccelaStudy® and Renkara® are registered trademarks of Renkara Media Group, Inc. All third-party marks are the property of their respective owners and are used for nominative identification only.