1. What is the difference between expdp and impdp?
- Answer:
- expdp (Data Pump Export): This utility is used to export database objects (tables, schemas, or entire databases) into a dump file. The export process can run in parallel and can be fine-tuned for performance.
- impdp (Data Pump Import): This utility is used to import data from dump files created by expdp into a database. Like expdp, impdp also supports parallel execution and advanced filtering options.
- Key Difference: While expdp is used for exporting data from Oracle databases, impdp is used for importing data back into Oracle databases.
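- A minimal illustration (the hr schema, the dpump_dir directory object, and the credentials shown are placeholders):
      expdp system/password SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp LOGFILE=hr_exp.log
      impdp system/password SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp LOGFILE=hr_imp.log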
2. What are the advantages of using Data Pump (expdp/impdp) over traditional exp/imp?
- Answer:
- Performance: Data Pump utilities support parallel processing, which significantly speeds up both export and import operations.
- Network Mode: Data Pump can export/import directly between databases over the network, bypassing the need for dump files.
- Flexibility: More granular control over the export and import process (e.g., filtering tables, schemas, or partitions).
- Restartability: Data Pump jobs can be stopped and later restarted from where they left off, something the traditional exp/imp utilities cannot do.
- Job Monitoring: Data Pump offers real-time monitoring and logging of operations.
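- For instance, a network-mode import can pull a schema straight from the source database over a database link with no intermediate dump file (a sketch; the database link source_db_link, the hr schema, and the dpump_dir directory object are assumed to exist, and the directory object is still needed for the log file):
      impdp system/password NETWORK_LINK=source_db_link SCHEMAS=hr DIRECTORY=dpump_dir LOGFILE=hr_net_imp.log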
3. Explain the concept of "PARALLEL" in Data Pump and how it improves performance.
- Answer:
- The PARALLEL parameter allows multiple worker processes to run in parallel during an export or import operation. By dividing the task among multiple processes, the overall time for data transfer is reduced.
- expdp and impdp can perform operations faster, especially with large datasets or highly partitioned tables, by utilizing multiple CPUs or cores.
- The PARALLEL value specifies the number of parallel workers that should be launched to handle the job. The higher the number, the more parallelism you achieve (subject to system resources).
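- A sketch of a parallel schema export (names are placeholders; the %U wildcard lets each worker write to its own dump file so the number of files keeps up with the PARALLEL setting):
      expdp system/password SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr_%U.dmp PARALLEL=4 LOGFILE=hr_par_exp.log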
4. What is the role of the DIRECTORY parameter in expdp/impdp?
- Answer:
- The DIRECTORY parameter specifies the directory object in the database where dump files are written (for expdp) or read from (for impdp). The directory object must exist in the database and point to an operating system path that is accessible to the database server, and READ/WRITE on the directory object must be granted to the database user running the job.
- For example:
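- A minimal sketch (the path, the dpump_dir object, and the hr user are placeholders):
      -- in SQL*Plus, as a DBA: create the directory object and grant access
      CREATE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';
      GRANT READ, WRITE ON DIRECTORY dpump_dir TO hr;

      # from the operating system shell: run the export through that directory object
      expdp hr/password DIRECTORY=dpump_dir DUMPFILE=hr.dmp LOGFILE=hr_exp.log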
5. How can you perform a schema-level export using expdp?
- Answer:
To export an entire schema, you can use the following command:
- SCHEMAS: This specifies the schema(s) to export. You can list multiple schemas by separating them with commas.
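- A sketch of the command (the schema names, directory object, and file names are placeholders):
      expdp system/password SCHEMAS=hr,scott DIRECTORY=dpump_dir DUMPFILE=schemas.dmp LOGFILE=schemas_exp.log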
6. What is the EXCLUDE parameter in Data Pump and how is it used?
- Answer:
- The EXCLUDE parameter is used to exclude certain objects from the export or import operation. For example, you can exclude tables, indexes, or constraints from the export.
- It is useful when you need to exclude specific objects to reduce the dump file size or to avoid exporting unnecessary objects.
- Example of excluding tables:
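- (A sketch using a parameter file, which avoids operating system quoting problems with the filter clause; the schema, table names, and file names are placeholders.)
      # contents of hr_excl.par
      DIRECTORY=dpump_dir
      DUMPFILE=hr_excl.dmp
      LOGFILE=hr_excl_exp.log
      SCHEMAS=hr
      EXCLUDE=TABLE:"IN ('EMPLOYEES','JOB_HISTORY')"

      # then run the export with the parameter file
      expdp system/password PARFILE=hr_excl.par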
7. What is the INCLUDE parameter in Data Pump and how is it used?
- Answer:
- The INCLUDE parameter allows you to include specific types of objects in an export or import job. This is the opposite of EXCLUDE.
- You can use it to focus only on specific database objects, such as tables, schemas, or indexes.
- Example of including only tables in the export:
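- (A minimal sketch; names are placeholders. INCLUDE=TABLE limits the job to tables and their dependent objects.)
      expdp system/password SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr_tables.dmp INCLUDE=TABLE LOGFILE=hr_tables_exp.log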
8. What is the FLASHBACK_TIME parameter in expdp?
- Answer:
- The FLASHBACK_TIME parameter lets you export data as it existed at a specific point in time. This is useful for producing a consistent snapshot of the data as of that time, even if the data is being modified while the export runs.
- For example:
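- (A sketch using a parameter file so the quotes in the timestamp expression survive the shell; the timestamp, schema, and file names are placeholders.)
      # contents of hr_flashback.par
      DIRECTORY=dpump_dir
      DUMPFILE=hr_fb.dmp
      LOGFILE=hr_fb_exp.log
      SCHEMAS=hr
      FLASHBACK_TIME="TO_TIMESTAMP('2024-01-15 08:00:00','YYYY-MM-DD HH24:MI:SS')"

      # then run the export with the parameter file
      expdp system/password PARFILE=hr_flashback.par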
9. How do you perform a transportable tablespace export/import with expdp/impdp?
- Answer:
- For transportable tablespaces, Data Pump exports only the metadata of the tablespaces, while the datafiles themselves are copied at the operating system level. This greatly reduces the time and complexity of moving large datasets between databases.
- First, place the tablespaces in read-only mode on the source and run an export with the TRANSPORT_TABLESPACES parameter, which produces a metadata-only dump file. (The related TRANSPORTABLE=ALWAYS option applies to table-mode and full transportable exports.)
- Then copy the dump file and the datafiles to the target server and use the impdp utility with the TRANSPORT_DATAFILES parameter to plug the tablespace into the target database, as sketched below.
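- A sketch of the overall flow (the users_data tablespace, the datafile path, and the file names are placeholders):
      # 1. In SQL*Plus on the source: ALTER TABLESPACE users_data READ ONLY;
      # 2. Export the tablespace metadata only:
      expdp system/password DIRECTORY=dpump_dir DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=users_data LOGFILE=tts_exp.log
      # 3. Copy tts.dmp and the tablespace datafiles to the target server, then plug them in:
      impdp system/password DIRECTORY=dpump_dir DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u01/oradata/users_data01.dbf' LOGFILE=tts_imp.log
      # 4. Set the tablespace back to read-write on the target (and on the source, if required).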
10. Explain the REMAP_SCHEMA parameter in impdp.
- Answer:
- The REMAP_SCHEMA parameter allows you to map a schema from the source database to a different schema in the target database during an import operation. This is useful when the schema names on the source and target databases differ.
- For example:
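- (A minimal sketch that imports objects exported from the hr schema into hr_dev; all names are placeholders.)
      impdp system/password DIRECTORY=dpump_dir DUMPFILE=hr.dmp REMAP_SCHEMA=hr:hr_dev LOGFILE=hr_remap_imp.log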
11. What is the purpose of the ACCESS_METHOD parameter in expdp/impdp?
- Answer:
- The ACCESS_METHOD parameter controls how the Data Pump job reads or writes data. By default, Data Pump chooses the fastest method available: it uses direct path when possible and falls back to the external table mechanism when direct path cannot be used.
- The value for ACCESS_METHOD can be:
- AUTOMATIC: Lets Data Pump choose the best method (the default).
- DIRECT_PATH: Forces the direct path API (usually the fastest).
- EXTERNAL_TABLE: Forces the external table mechanism, used when direct path is not possible. (For import, newer releases also accept CONVENTIONAL, which is slower.)
- Example:
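- (A sketch that forces direct path for a schema export; all names are placeholders.)
      expdp system/password SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr_direct.dmp ACCESS_METHOD=DIRECT_PATH LOGFILE=hr_direct_exp.log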
12. How do you monitor the progress of an expdp or impdp job?
- Answer:
- You can monitor the progress of an expdp or impdp job by querying the DBA_DATAPUMP_JOBS and DBA_DATAPUMP_SESSIONS views.
- Additionally, the STATUS parameter (or the STATUS command at the interactive prompt) reports progress at intervals during job execution, and the log file records detailed messages.
- An example monitoring query is shown below:
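- (A sketch; querying these views requires DBA privileges.)
      -- list Data Pump jobs and their current state
      SELECT owner_name, job_name, operation, job_mode, state, degree
      FROM   dba_datapump_jobs;

      -- sessions attached to those jobs
      SELECT owner_name, job_name, session_type
      FROM   dba_datapump_sessions;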
13. How can you restart a failed Data Pump job?
- Answer:
- If a Data Pump job fails or is stopped, it can be resumed from where it left off: reattach to the job with the ATTACH parameter and then issue the START_JOB (or CONTINUE_CLIENT) command at the interactive prompt.
- Example:
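- (A sketch of reattaching to a stopped export job; the job name shown is only the typical system-generated format.)
      # reattach to the job by name
      expdp system/password ATTACH=SYS_EXPORT_SCHEMA_01

      # at the interactive Export> prompt, resume the job
      Export> START_JOB
      Export> CONTINUE_CLIENT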