Python Exception Handling: Graceful Failure the Pythonic Way

Here’s the thing about Python and error handling: Python doesn’t just make it easy, it makes it elegant. While many languages treat exceptions as a necessary evil, Python embraces them as a natural part of program flow. The Python philosophy is “it’s easier to ask for forgiveness than permission,” and nowhere is this more visible than in exception handling.

Don’t worry about memorizing every exception type right away. The goal here is to understand Python’s approach to handling errors and see how clean, readable error handling can make your code more robust and maintainable. You’ll pick up the patterns with practice.

Why Pythonic Exception Handling Matters

Before we dive into the syntax, let me tell you why Python’s approach to exceptions matters so much in practice. Python is the go-to language for data processing, web development, automation, and scientific computing. When you can write code that fails gracefully with clear, helpful error messages, people notice. It’s the difference between scripts that work on your machine and applications that work in production.

The Pythonic Philosophy: EAFP

Python follows EAFP - “Easier to Ask for Forgiveness than Permission.” Instead of checking whether something might go wrong, you try it and handle the exception if it does. This makes code cleaner and, when exceptions are rare, often faster:

# Pythonic (EAFP)
def get_user_age(user_data):
    try:
        return int(user_data['age'])
    except (KeyError, ValueError, TypeError):
        return None

# Less Pythonic (LBYL - Look Before You Leap)
def get_user_age_verbose(user_data):
    if not isinstance(user_data, dict):
        return None
    if 'age' not in user_data:
        return None
    if not isinstance(user_data['age'], (str, int)):
        return None
    try:
        return int(user_data['age'])
    except ValueError:
        return None

The first version is cleaner, more readable, and handles edge cases you might not have thought of.
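
To get a feel for how much the EAFP version quietly absorbs, here is a quick sketch that feeds get_user_age a few malformed inputs (the sample values are made up for illustration):

# Edge cases the EAFP version handles without any extra checks
print(get_user_age({'age': '42'}))      # 42   (digit string converts cleanly)
print(get_user_age({'name': 'Alice'}))  # None (missing key -> KeyError)
print(get_user_age({'age': 'unknown'})) # None (bad string -> ValueError)
print(get_user_age({'age': [42]}))      # None (wrong type -> TypeError)
print(get_user_age(None))               # None (not a dict -> TypeError)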

Python Exception Hierarchy - The Basics

Python’s exception hierarchy is designed to be intuitive. Here are the ones you’ll encounter most often:

def demonstrate_common_exceptions():
    """Show the most common exceptions you'll encounter."""
    print("=== Common Python Exceptions ===")
    
    # FileNotFoundError - trying to open a file that doesn't exist
    try:
        with open('nonexistent_file.txt', 'r') as f:
            content = f.read()
    except FileNotFoundError as e:
        print(f"File not found: {e}")
    
    # ValueError - right type, but an inappropriate value
    try:
        number = int("not_a_number")
    except ValueError as e:
        print(f"Value error: {e}")
    
    # KeyError - dictionary key doesn't exist
    try:
        data = {'name': 'Alice'}
        age = data['age']  # Key doesn't exist
    except KeyError as e:
        print(f"Key error: {e}")
    
    # IndexError - list index out of range
    try:
        items = [1, 2, 3]
        item = items[10]  # Index doesn't exist
    except IndexError as e:
        print(f"Index error: {e}")
    
    # TypeError - operation applied to the wrong type
    try:
        result = "hello" + 5  # Can't add string and int
    except TypeError as e:
        print(f"Type error: {e}")

demonstrate_common_exceptions()
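
The "hierarchy" part matters because an except clause also matches subclasses: catching OSError, for example, also catches FileNotFoundError and PermissionError. A small sketch to confirm the relationships yourself:

# Exception classes form an inheritance tree, and except clauses match subclasses
print(issubclass(FileNotFoundError, OSError))   # True
print(issubclass(PermissionError, OSError))     # True
print(issubclass(KeyError, LookupError))        # True
print(issubclass(IndexError, LookupError))      # True
print(issubclass(ValueError, Exception))        # True

# So a single OSError handler covers every OSError subclass:
try:
    open('definitely_missing.txt')
except OSError as e:
    print(f"Caught {type(e).__name__} via OSError: {e}")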

File I/O Exception Handling - The Pythonic Way

Python’s approach to file operations is beautifully clean. Let’s start with the most common scenario: reading a file that might not exist.

Basic File Reading with Pythonic Exception Handling

from pathlib import Path
from typing import List

def read_file_basic(filename: str) -> List[str]:
    """Read a file and return lines, handling exceptions gracefully."""
    print(f"=== Reading {filename} ===")
    
    try:
        with open(filename, 'r', encoding='utf-8') as f:
            lines = f.readlines()
        
        # Clean up whitespace (Pythonic!)
        lines = [line.rstrip('\n\r') for line in lines]
        print(f"Successfully read {len(lines)} lines from {filename}")
        return lines
        
    except FileNotFoundError:
        print(f"File not found: {filename}")
        print("Make sure the file exists and the path is correct.")
        return []
        
    except PermissionError:
        print(f"Permission denied: {filename}")
        print("Check if you have read permissions for this file.")
        return []
        
    except UnicodeDecodeError as e:
        print(f"Encoding error in {filename}: {e}")
        print("Try specifying a different encoding.")
        return []
        
    except OSError as e:
        print(f"OS error reading {filename}: {e}")
        print("This could be a disk error or the file is locked.")
        return []

def read_file_robust(filename: str) -> List[str]:
    """More robust file reading with pathlib and detailed error handling."""
    print(f"\n=== Robust Reading {filename} ===")
    
    file_path = Path(filename)
    
    # Pre-flight checks (LBYL) - justified here because each failure gets a specific message
    if not file_path.exists():
        print(f"File doesn't exist: {filename}")
        return []
    
    if not file_path.is_file():
        print(f"Path is not a file: {filename}")
        return []
    
    # Check file size before reading (prevent memory issues)
    try:
        file_size = file_path.stat().st_size
        if file_size > 100 * 1024 * 1024:  # 100MB limit
            print(f"Warning: File is large ({file_size:,} bytes)")
            response = input("Continue reading? (y/n): ").lower()
            if response != 'y':
                return []
    except OSError as e:
        print(f"Cannot check file stats: {e}")
        return []
    
    try:
        # Use pathlib's read_text for simple cases
        content = file_path.read_text(encoding='utf-8')
        lines = content.splitlines()
        
        print(f"Successfully read {len(lines)} lines from {filename}")
        return lines
        
    except UnicodeDecodeError:
        # Try different encodings
        for encoding in ['latin-1', 'cp1252', 'iso-8859-1']:
            try:
                content = file_path.read_text(encoding=encoding)
                lines = content.splitlines()
                print(f"Successfully read {len(lines)} lines using {encoding} encoding")
                return lines
            except UnicodeDecodeError:
                continue
        
        print(f"Could not decode {filename} with any common encoding")
        return []
        
    except OSError as e:
        print(f"Error reading file: {e}")
        return []


def create_sample_file(filename: str) -> None:
    """Create a sample file for testing."""
    sample_data = [
        "Line 1: This is a test file",
        "Line 2: Created for exception handling demo",
        "Line 3: Each line has different content",
        "Line 4: To show how file reading works",
        "Line 5: With proper exception handling"
    ]
    
    try:
        with open(filename, 'w', encoding='utf-8') as f:
            f.write('\n'.join(sample_data))
        print(f"Created sample file: {filename}")
    except OSError as e:
        print(f"Error creating sample file: {e}")


def demonstrate_file_reading():
    """Demonstrate different file reading approaches."""
    # Test with non-existent file
    read_file_basic("does_not_exist.txt")
    
    # Create and read a test file
    create_sample_file("test.txt")
    read_file_robust("test.txt")

# Run the demo
demonstrate_file_reading()

File Writing with Pythonic Exception Handling

Python’s context managers (with statements) make file writing much cleaner than the manual open/close bookkeeping required in many other languages:
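
Under the hood, a with statement behaves roughly like a try/finally block that always closes the file. A minimal sketch of the boilerplate it replaces (notes.txt is just a placeholder filename):

# What the with statement saves you from writing by hand
f = open('notes.txt', 'w', encoding='utf-8')
try:
    f.write('hello\n')
finally:
    f.close()  # runs even if write() raises

# The same thing, the Pythonic way
with open('notes.txt', 'w', encoding='utf-8') as f:
    f.write('hello\n')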

import tempfile
import shutil
from pathlib import Path
from typing import List

def write_file_basic(filename: str, data: List[str]) -> bool:
    """Write data to a file with basic exception handling."""
    print(f"=== Writing to {filename} ===")
    
    try:
        with open(filename, 'w', encoding='utf-8') as f:
            f.write('\n'.join(data))
        
        print(f"Successfully wrote {len(data)} lines to {filename}")
        return True
        
    except PermissionError:
        print(f"Permission denied writing to {filename}")
        return False
        
    except OSError as e:
        print(f"Error writing file: {e}")
        return False

def write_file_safe(filename: str, data: List[str]) -> bool:
    """Write file safely with atomic operations and backups."""
    print(f"\n=== Safe Writing to {filename} ===")
    
    file_path = Path(filename)
    
    # Create parent directories if needed (Pythonic!)
    file_path.parent.mkdir(parents=True, exist_ok=True)
    
    # Check available space
    try:
        free_space = shutil.disk_usage(file_path.parent).free
        estimated_size = sum(len(line.encode('utf-8')) for line in data)
        
        if free_space < estimated_size * 2:  # Safety margin
            print(f"Warning: Low disk space. Available: {free_space:,} bytes")
            
    except OSError as e:
        print(f"Warning: Cannot check disk space: {e}")
    
    # Create backup if file exists
    backup_path = None
    if file_path.exists():
        backup_path = file_path.with_suffix(file_path.suffix + '.backup')
        try:
            shutil.copy2(file_path, backup_path)
            print(f"Created backup: {backup_path}")
        except OSError as e:
            print(f"Warning: Could not create backup: {e}")
    
    # Write to temporary file first (atomic operation)
    try:
        with tempfile.NamedTemporaryFile(
            mode='w',
            encoding='utf-8',
            dir=file_path.parent,
            prefix=file_path.stem + '_',
            suffix='.tmp',
            delete=False
        ) as temp_file:
            temp_path = Path(temp_file.name)
            temp_file.write('\n'.join(data))
        
        # Atomic move (as atomic as the OS allows)
        temp_path.replace(file_path)
        
        print(f"Successfully wrote {len(data)} lines to {filename}")
        
        # Clean up backup on success
        if backup_path and backup_path.exists():
            backup_path.unlink()
            
        return True
        
    except OSError as e:
        print(f"Error writing file: {e}")
        
        # Restore backup if write failed
        if backup_path and backup_path.exists():
            try:
                shutil.copy2(backup_path, file_path)
                print("Restored from backup")
                backup_path.unlink()
            except OSError as restore_error:
                print(f"Error restoring backup: {restore_error}")
        
        # Clean up temp file
        if 'temp_path' in locals() and temp_path.exists():
            temp_path.unlink()
            
        return False

def append_to_file(filename: str, content: str) -> bool:
    """Append content to a file."""
    print(f"\n=== Appending to {filename} ===")
    
    try:
        with open(filename, 'a', encoding='utf-8') as f:
            f.write('\n' + content)
        
        print(f"Successfully appended to {filename}")
        return True
        
    except OSError as e:
        print(f"Error appending to file: {e}")
        return False

def demonstrate_file_writing():
    """Demonstrate file writing with exception handling."""
    test_data = [
        "First line of data",
        "Second line with more content",
        "Third line: numbers 123, 456, 789",
        "Fourth line: special chars !@#$%^&*()",
        "Fifth line: final content"
    ]
    
    # Test basic writing
    write_file_basic("output1.txt", test_data)
    
    # Test safe writing with directory creation
    write_file_safe("subdirectory/output2.txt", test_data)
    
    # Test appending
    append_to_file("output1.txt", "This line was appended later")

demonstrate_file_writing()

Advanced Pythonic Exception Patterns

Python’s exception handling really shines with these advanced patterns:

from pathlib import Path
from typing import Any, Dict, List, Optional
from contextlib import contextmanager

class FileProcessingError(Exception):
    """Custom exception for file processing errors."""
    
    def __init__(self, message: str, filename: str, line_number: int = 0, original_error: Optional[Exception] = None):
        super().__init__(message)
        self.filename = filename
        self.line_number = line_number
        self.original_error = original_error
    
    def __str__(self):
        base_message = f"FileProcessingError: {super().__str__()}"
        context = f" (file: {self.filename}"
        
        if self.line_number > 0:
            context += f", line: {self.line_number}"
        
        context += ")"
        
        if self.original_error:
            context += f" - Original error: {self.original_error}"
            
        return base_message + context

def parse_numbers_from_file(filename: str) -> List[int]:
    """Parse integers from a file, one per line, with detailed error reporting."""
    
    file_path = Path(filename)
    if not file_path.exists():
        raise FileProcessingError("File not found", filename)
    
    numbers = []
    
    try:
        with open(filename, 'r', encoding='utf-8') as f:
            for line_number, line in enumerate(f, 1):
                line = line.strip()
                
                # Skip empty lines and comments
                if not line or line.startswith('#'):
                    continue
                
                try:
                    number = int(line)
                    numbers.append(number)
                except ValueError as e:
                    raise FileProcessingError(
                        f"Invalid number format: '{line}'",
                        filename,
                        line_number,
                        e
                    ) from e
                    
    except OSError as e:
        raise FileProcessingError("Error reading file", filename, original_error=e) from e
    
    return numbers

@contextmanager
def safe_file_operation(filename: str, operation: str = "processing"):
    """Context manager for safe file operations with cleanup."""
    print(f"Starting {operation} on {filename}")
    
    try:
        yield filename
        print(f"Successfully completed {operation} on {filename}")
    except Exception as e:
        print(f"Error during {operation} on {filename}: {e}")
        raise
    finally:
        print(f"Finished {operation} on {filename}")

def process_file_with_retry(filename: str, max_retries: int = 3) -> Optional[List[int]]:
    """Process a file with retry logic and exponential backoff."""
    import time
    
    print(f"=== Processing {filename} with retry logic ===")
    
    for attempt in range(1, max_retries + 1):
        try:
            with safe_file_operation(filename, f"attempt {attempt}"):
                numbers = parse_numbers_from_file(filename)
                print(f"Successfully parsed {len(numbers)} numbers: {numbers}")
                return numbers
                
        except FileProcessingError as e:
            print(f"Attempt {attempt} failed: {e}")
            
            if attempt == max_retries:
                print("All retry attempts exhausted")
                print(f"Final error details: {e}")
                return None
            else:
                # Exponential backoff
                sleep_time = 2 ** (attempt - 1)
                print(f"Retrying in {sleep_time} seconds...")
                time.sleep(sleep_time)
    
    return None

def process_multiple_files(filenames: List[str]) -> Dict[str, Any]:
    """Process multiple files and collect results."""
    print("\n=== Processing Multiple Files ===")
    
    results = {
        'successful': {},
        'failed': {},
        'total_numbers': 0
    }
    
    for filename in filenames:
        try:
            # Create test files for demo
            create_number_file(filename)
            
            numbers = parse_numbers_from_file(filename)
            results['successful'][filename] = numbers
            results['total_numbers'] += len(numbers)
            print(f"✓ Processed {filename}: {len(numbers)} numbers")
            
        except FileProcessingError as e:
            results['failed'][filename] = str(e)
            print(f"✗ Failed {filename}: {e}")
        except Exception as e:
            results['failed'][filename] = f"Unexpected error: {e}"
            print(f"✗ Unexpected error in {filename}: {e}")
    
    # Summary
    print("\nSummary:")
    print(f"  Successful files: {len(results['successful'])}")
    print(f"  Failed files: {len(results['failed'])}")
    print(f"  Total numbers processed: {results['total_numbers']}")
    
    return results

def create_number_file(filename: str) -> None:
    """Create a test file with numbers for demonstration."""
    
    sample_content = {
        "numbers1.txt": ["# Valid numbers", "42", "123", "", "456", "789"],
        "numbers2.txt": ["100", "200", "# Another comment", "300"],
        "numbers3.txt": ["# This file has an error", "111", "not_a_number", "333"]
    }
    
    content = sample_content.get(filename, ["42", "123", "456"])
    
    try:
        with open(filename, 'w', encoding='utf-8') as f:
            f.write('\n'.join(content))
    except OSError as e:
        print(f"Error creating test file {filename}: {e}")

def demonstrate_advanced_handling():
    """Demonstrate advanced exception handling patterns."""
    
    # Test with valid file
    create_number_file("valid_numbers.txt")
    process_file_with_retry("valid_numbers.txt")
    
    # Test with a file containing invalid data (create it first)
    create_number_file("numbers3.txt")
    process_file_with_retry("numbers3.txt", max_retries=2)
    
    # Test multiple file processing
    test_files = ["numbers1.txt", "numbers2.txt", "numbers3.txt"]
    results = process_multiple_files(test_files)
    
    return results

# Run the advanced demo
demonstrate_advanced_handling()

Pythonic Exception Handling Best Practices

Here are the patterns that make Python exception handling elegant and maintainable:

The Pythonic Way (Do This)

import csv
import json
import shutil
from pathlib import Path
from typing import Any, Dict, List, Optional

class PythonicExceptionHandling:
    """Examples of Pythonic exception handling patterns."""
    
    def load_config(self, filename: str) -> Dict[str, Any]:
        """Load configuration with specific exception handling."""
        
        # Use pathlib (Pythonic!)
        config_path = Path(filename)
        
        try:
            return json.loads(config_path.read_text())
        except FileNotFoundError:
            # Provide sensible defaults
            print(f"Config file {filename} not found, using defaults")
            return self.get_default_config()
        except json.JSONDecodeError as e:
            # Give detailed error information
            raise ValueError(f"Invalid JSON in {filename}: {e}") from e
        except PermissionError:
            # Handle specific error cases
            raise PermissionError(f"Cannot read config file {filename}")
    
    def get_default_config(self) -> Dict[str, Any]:
        """Return default configuration."""
        return {
            'host': 'localhost',
            'port': 8000,
            'debug': False
        }
    
    def safe_int_conversion(self, value: Any, default: int = 0) -> int:
        """Convert value to int with fallback (Pythonic EAFP)."""
        try:
            return int(value)
        except (ValueError, TypeError):
            return default
    
    def process_user_data(self, data: Dict[str, Any]) -> Dict[str, Any]:
        """Process user data with multiple exception types."""
        
        processed = {}
        
        try:
            # Chain operations that might fail
            processed['age'] = self.safe_int_conversion(data['age'])
            processed['name'] = str(data['name']).strip()
            processed['email'] = self.validate_email(data['email'])
            
        except KeyError as e:
            raise ValueError(f"Missing required field: {e}") from e
        
        return processed
    
    def validate_email(self, email: str) -> str:
        """Simple email validation."""
        if '@' not in email:
            raise ValueError("Invalid email format")
        return email.lower().strip()
    
    def read_data_file(self, filename: str) -> Optional[List[Dict[str, Any]]]:
        """Read data file with comprehensive error handling."""
        
        file_path = Path(filename)
        
        # Early return for missing file (Pythonic!)
        if not file_path.exists():
            return None
        
        try:
            with file_path.open('r', encoding='utf-8') as f:
                if filename.endswith('.json'):
                    return json.load(f)
                elif filename.endswith('.csv'):
                    return list(csv.DictReader(f))
                else:
                    # Fallback: read as lines
                    return [{'line': line.strip()} for line in f]
                    
        except (json.JSONDecodeError, csv.Error) as e:
            print(f"Error parsing {filename}: {e}")
            return None
        except UnicodeDecodeError as e:
            print(f"Encoding error in {filename}: {e}")
            return None
        except OSError as e:
            print(f"OS error reading {filename}: {e}")
            return None

# Good: Using else clause with try-except
def copy_file_pythonic(source: str, destination: str) -> bool:
    """Copy file with Pythonic exception handling."""
    
    try:
        shutil.copy2(source, destination)
    except FileNotFoundError:
        print(f"Source file not found: {source}")
        return False
    except PermissionError:
        print(f"Permission denied copying to: {destination}")
        return False
    except OSError as e:
        print(f"OS error during copy: {e}")
        return False
    else:
        # This runs only if no exception occurred
        print(f"Successfully copied {source} to {destination}")
        return True
    finally:
        # This always runs (for cleanup if needed)
        print("Copy operation completed")

# Good: Exception chaining with 'from'
def parse_config_with_context(filename: str) -> Dict[str, Any]:
    """Parse config file with exception chaining."""
    
    try:
        with open(filename, 'r') as f:
            config_text = f.read()
    except OSError as e:
        raise ConfigurationError(f"Cannot read config file {filename}") from e
    
    try:
        return json.loads(config_text)
    except json.JSONDecodeError as e:
        raise ConfigurationError(f"Invalid JSON in {filename}") from e

class ConfigurationError(Exception):
    """Custom exception for configuration errors."""
    pass

Anti-Patterns (Don’t Do This)

# Bad: Bare except clause
def bad_example_1():
    try:
        # Some operation
        risky_operation()
    except:  # Never do this!
        pass  # Silent failure is the worst

# Bad: Catching Exception when you mean something specific
def bad_example_2():
    try:
        with open('file.txt', 'r') as f:
            data = f.read()
    except Exception:  # Too broad!
        print("Something went wrong")

# Bad: Not using exception chaining
def bad_example_3():
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        raise ValueError("Bad JSON")  # Lost the original error!

# Better versions
def good_example_1():
    try:
        risky_operation()
    except SpecificError as e:
        logger.error(f"Expected error occurred: {e}")
        # Handle or re-raise appropriately

def good_example_2():
    try:
        with open('file.txt', 'r') as f:
            data = f.read()
    except FileNotFoundError:
        print("File not found")
    except PermissionError:
        print("Permission denied")
    except OSError as e:
        print(f"OS error: {e}")

def good_example_3():
    try:
        data = json.loads(text)
    except json.JSONDecodeError as e:
        raise ValueError("Bad JSON format") from e  # Chain the exceptions

Real-World Example: Configuration File Processor

Let’s put it all together with a realistic, Pythonic example:

import json
import shutil
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, Optional
from contextlib import contextmanager

class ConfigProcessor:
    """Pythonic configuration file processor with robust error handling."""
    
    def __init__(self, config_dir: str = "config"):
        self.config_dir = Path(config_dir)
        self.config_dir.mkdir(exist_ok=True)
    
    def load_config(self, filename: str) -> Dict[str, Any]:
        """Load configuration from file with comprehensive error handling."""
        
        config_path = self.config_dir / filename
        
        if not config_path.exists():
            raise FileNotFoundError(f"Configuration file not found: {config_path}")
        
        if not config_path.is_file():
            raise IsADirectoryError(f"Path is a directory, not a file: {config_path}")
        
        try:
            # Use pathlib's read_text (Pythonic!)
            content = config_path.read_text(encoding='utf-8')
            
            # Support both JSON and simple key=value format
            if filename.endswith('.json'):
                config = json.loads(content)
            else:
                config = self._parse_key_value_format(content, str(config_path))
            
            print(f"Loaded configuration from {config_path}")
            return config
            
        except json.JSONDecodeError as e:
            raise ValueError(f"Invalid JSON in {config_path}: {e}") from e
        except UnicodeDecodeError as e:
            raise ValueError(f"Cannot decode {config_path}: {e}") from e
        except OSError as e:
            raise OSError(f"Error reading {config_path}: {e}") from e
    
    def _parse_key_value_format(self, content: str, filename: str) -> Dict[str, str]:
        """Parse simple key=value configuration format."""
        
        config = {}
        
        for line_num, line in enumerate(content.splitlines(), 1):
            line = line.strip()
            
            # Skip comments and empty lines
            if not line or line.startswith('#'):
                continue
            
            if '=' not in line:
                raise ValueError(
                    f"Invalid format at line {line_num} in {filename}: '{line}'"
                )
            
            key, _, value = line.partition('=')
            key = key.strip()
            value = value.strip()
            
            if not key:
                raise ValueError(f"Empty key at line {line_num} in {filename}")
            
            config[key] = value
        
        return config
    
    @contextmanager
    def atomic_write(self, filename: str):
        """Context manager for atomic file writing."""
        
        config_path = self.config_dir / filename
        temp_path = config_path.with_suffix('.tmp')
        backup_path = config_path.with_suffix('.backup')
        
        # Create backup if original exists
        if config_path.exists():
            try:
                shutil.copy2(config_path, backup_path)
            except OSError as e:
                raise OSError(f"Cannot create backup: {e}") from e
        
        try:
            # Yield temporary file path
            yield temp_path
            
            # Atomic move (as atomic as the OS allows)
            temp_path.replace(config_path)
            
            # Clean up backup on success
            backup_path.unlink(missing_ok=True)
            
        except Exception:
            # Clean up temp file on error
            temp_path.unlink(missing_ok=True)
            
            # Restore backup if needed
            if backup_path.exists() and not config_path.exists():
                backup_path.replace(config_path)
            
            raise
    
    def save_config(self, filename: str, config: Dict[str, Any]) -> None:
        """Save configuration to file atomically."""
        
        try:
            with self.atomic_write(filename) as temp_path:
                if filename.endswith('.json'):
                    # Pretty-print JSON (Pythonic!)
                    content = json.dumps(config, indent=2, sort_keys=True)
                else:
                    # Generate key=value format
                    lines = [f"# Configuration file generated on {datetime.now()}"]
                    lines.extend(f"{key}={value}" for key, value in sorted(config.items()))
                    content = '\n'.join(lines)
                
                temp_path.write_text(content, encoding='utf-8')
            
            print(f"Successfully saved configuration to {self.config_dir / filename}")
            
        except OSError as e:
            raise OSError(f"Error saving configuration: {e}") from e
    
    def update_config(self, filename: str, updates: Dict[str, Any]) -> Dict[str, Any]:
        """Update existing configuration with new values."""
        
        try:
            # Load existing config
            config = self.load_config(filename)
        except FileNotFoundError:
            # Create new config if file doesn't exist
            config = {}
        
        # Update with new values
        config.update(updates)
        
        # Save updated config
        self.save_config(filename, config)
        
        return config
    
    def validate_config(self, config: Dict[str, Any], required_keys: Optional[set] = None) -> bool:
        """Validate configuration against requirements."""
        
        if required_keys:
            missing_keys = required_keys - set(config.keys())
            if missing_keys:
                raise ValueError(f"Missing required configuration keys: {missing_keys}")
        
        # Add specific validation rules as needed
        if 'port' in config:
            try:
                port = int(config['port'])
            except (ValueError, TypeError) as e:
                raise ValueError(f"Invalid port value: {config['port']}") from e
            if not (1 <= port <= 65535):
                raise ValueError(f"Invalid port number: {port}")
        
        return True

def demonstrate_config_processor():
    """Demonstrate the configuration processor."""
    
    print("=== Configuration Processor Demo ===")
    
    processor = ConfigProcessor()
    
    # Sample configuration
    sample_config = {
        'database': {
            'host': 'localhost',
            'port': 5432,
            'name': 'myapp'
        },
        'app': {
            'name': 'My Application',
            'debug': True,
            'version': '1.2.3'
        }
    }
    
    try:
        # Save configuration
        processor.save_config('app.json', sample_config)
        
        # Load and validate
        loaded_config = processor.load_config('app.json')
        print(f"Loaded config keys: {list(loaded_config.keys())}")
        
        # Update configuration
        updates = {'app': {'version': '1.3.0', 'debug': False}}
        updated_config = processor.update_config('app.json', updates)
        print(f"Updated version to: {updated_config['app']['version']}")
        
        # Test validation
        processor.validate_config(updated_config)
        print("Configuration validation passed")
        
    except (ValueError, OSError) as e:
        print(f"Configuration error: {e}")
        if e.__cause__:
            print(f"Root cause: {e.__cause__}")
    
    # Test error handling
    try:
        processor.load_config('nonexistent.json')
    except FileNotFoundError as e:
        print(f"Expected error: {e}")

# Run the demonstration
demonstrate_config_processor()

The Pythonic Bottom Line

Python’s approach to exception handling is built around these core principles:

  1. EAFP over LBYL - Try first, handle exceptions rather than checking conditions
  2. Specific exceptions - Catch exactly what you expect, let unexpected errors bubble up
  3. Exception chaining - Use from to preserve the original error context
  4. Context managers - Use with statements for resource management
  5. Fail fast, fail clearly - Provide helpful error messages with context
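
As a compact sketch tying principles 2 through 4 together (load_settings, SettingsError, and the settings.json path are illustrative names, not part of the examples above):

import json
from pathlib import Path

class SettingsError(Exception):
    """Raised when application settings cannot be loaded."""

def load_settings(path: str = "settings.json") -> dict:
    try:
        with Path(path).open(encoding="utf-8") as f:   # context manager (principle 4)
            return json.load(f)
    except FileNotFoundError:                          # specific exception (principle 2)
        return {}                                      # sensible default
    except json.JSONDecodeError as e:                  # chain to preserve context (principle 3)
        raise SettingsError(f"Malformed settings file: {path}") from e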

The most important thing to remember? Python exceptions aren’t failures - they’re information. They tell you exactly what went wrong and where, making debugging much easier than silent failures or cryptic error codes.

Start using these patterns in your file I/O operations, and you’ll quickly see why Python developers love the language’s approach to error handling. It’s not just about preventing crashes - it’s about writing code that’s robust, maintainable, and actually pleasant to debug when things do go wrong.

Remember: the goal isn’t to catch every possible exception. The goal is to handle the ones you can do something about, and let the rest bubble up with enough context to be useful. That’s the Pythonic way.

Advanced Pythonic Patterns

Let’s look at some more sophisticated exception handling patterns that you’ll use in real applications:

Using Custom Exception Hierarchies

import json
from typing import Any, Dict, List, Optional

class DataProcessingError(Exception):
    """Base exception for data processing errors."""
    pass

class ValidationError(DataProcessingError):
    """Data failed validation."""
    
    def __init__(self, message: str, field_name: Optional[str] = None, value: Any = None):
        super().__init__(message)
        self.field_name = field_name
        self.value = value
    
    def __str__(self):
        base_msg = super().__str__()
        if self.field_name:
            base_msg += f" (field: {self.field_name}"
            if self.value is not None:
                base_msg += f", value: {self.value}"
            base_msg += ")"
        return base_msg

class FormatError(DataProcessingError):
    """Data format is incorrect."""
    pass

class ProcessingTimeoutError(DataProcessingError):
    """Processing took too long."""
    pass

def validate_user_record(record: Dict[str, Any]) -> Dict[str, Any]:
    """Validate a user record with specific exceptions."""
    
    validated = {}
    
    # Required fields
    required_fields = {'name', 'email', 'age'}
    missing_fields = required_fields - set(record.keys())
    if missing_fields:
        raise ValidationError(f"Missing required fields: {missing_fields}")
    
    # Validate name
    name = record['name']
    if not isinstance(name, str) or not name.strip():
        raise ValidationError("Name must be a non-empty string", "name", name)
    validated['name'] = name.strip()
    
    # Validate email
    email = record['email']
    if not isinstance(email, str) or '@' not in email:
        raise ValidationError("Invalid email format", "email", email)
    validated['email'] = email.lower().strip()
    
    # Validate age
    try:
        age = int(record['age'])
        if age < 0 or age > 150:
            raise ValidationError("Age must be between 0 and 150", "age", age)
        validated['age'] = age
    except (ValueError, TypeError) as e:
        raise ValidationError("Age must be a valid integer", "age", record['age']) from e
    
    return validated

def process_user_file(filename: str) -> List[Dict[str, Any]]:
    """Process a file of user records with detailed error reporting."""
    
    valid_records = []
    errors = []
    
    try:
        with open(filename, 'r', encoding='utf-8') as f:
            data = json.load(f)
    except FileNotFoundError:
        raise DataProcessingError(f"User file not found: {filename}")
    except json.JSONDecodeError as e:
        raise FormatError(f"Invalid JSON in {filename}: {e}") from e
    except OSError as e:
        raise DataProcessingError(f"Cannot read {filename}: {e}") from e
    
    if not isinstance(data, list):
        raise FormatError("File must contain a JSON array of user records")
    
    for index, record in enumerate(data):
        try:
            validated_record = validate_user_record(record)
            valid_records.append(validated_record)
        except ValidationError as e:
            error_msg = f"Record {index + 1}: {e}"
            errors.append(error_msg)
            print(f"⚠️  {error_msg}")
    
    if errors and not valid_records:
        raise DataProcessingError(f"No valid records found. Errors: {len(errors)}")
    
    if errors:
        print(f"✓ Processed {len(valid_records)} valid records, {len(errors)} errors")
    else:
        print(f"✓ Successfully processed all {len(valid_records)} records")
    
    return valid_records

Decorator-Based Exception Handling

import functools
import json
import time
from typing import Any, Callable, Dict, List, Tuple, Type, Union

def retry_on_exception(
    exceptions: Union[Type[Exception], Tuple[Type[Exception], ...]] = Exception,
    max_retries: int = 3,
    delay: float = 1.0,
    backoff_factor: float = 2.0
):
    """Decorator to retry function calls on specific exceptions."""
    
    def decorator(func: Callable):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None
            
            for attempt in range(max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    last_exception = e
                    
                    if attempt == max_retries:
                        print(f"❌ Final attempt failed for {func.__name__}")
                        raise
                    
                    sleep_time = delay * (backoff_factor ** attempt)
                    print(f"🔄 Attempt {attempt + 1} failed for {func.__name__}, "
                          f"retrying in {sleep_time:.1f}s: {e}")
                    time.sleep(sleep_time)
            
            # This shouldn't be reached, but just in case
            raise last_exception
        
        return wrapper
    return decorator

def handle_exceptions(*exception_types: Type[Exception], default_return=None):
    """Decorator to handle specific exceptions and return default value."""
    
    def decorator(func: Callable):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except exception_types as e:
                print(f"⚠️  {func.__name__} failed with {type(e).__name__}: {e}")
                return default_return
        
        return wrapper
    return decorator

# Example usage of decorators
@retry_on_exception(OSError, max_retries=3, delay=0.5)
def read_remote_file(url: str) -> str:
    """Simulate reading from a potentially unreliable source."""
    import random
    
    if random.choice([True, False, False]):  # 33% chance of success
        return f"Content from {url}"
    else:
        raise OSError("Network connection failed")

@handle_exceptions(FileNotFoundError, ValueError, default_return=[])
def load_data_safe(filename: str) -> List[Dict[str, Any]]:
    """Load data with automatic error handling."""
    with open(filename, 'r') as f:
        return json.load(f)

def demonstrate_decorators():
    """Demonstrate decorator-based exception handling."""
    print("=== Decorator-Based Exception Handling ===")
    
    # Test retry decorator
    try:
        content = read_remote_file("https://api.example.com/data")
        print(f"✓ Successfully retrieved: {content}")
    except OSError as e:
        print(f"❌ Failed after all retries: {e}")
    
    # Test error handling decorator
    data = load_data_safe("nonexistent.json")
    print(f"Loaded data (with defaults): {data}")

Context Managers for Resource Management

import json
import sqlite3
import tempfile
from contextlib import contextmanager
from pathlib import Path
from typing import Generator

@contextmanager
def database_transaction(db_path: str) -> Generator[sqlite3.Connection, None, None]:
    """Context manager for database transactions with automatic rollback."""
    
    conn = None
    try:
        conn = sqlite3.connect(db_path)
        conn.execute("BEGIN")
        print(f"🔄 Started transaction on {db_path}")
        
        yield conn
        
        conn.commit()
        print("✓ Transaction committed successfully")
        
    except Exception as e:
        if conn:
            conn.rollback()
            print(f"❌ Transaction rolled back due to error: {e}")
        raise
    finally:
        if conn:
            conn.close()
            print("🔒 Database connection closed")

@contextmanager
def temporary_file_with_content(content: str, suffix: str = '.txt') -> Generator[Path, None, None]:
    """Context manager for temporary files with automatic cleanup."""
    
    temp_file = None
    try:
        # Create temporary file
        with tempfile.NamedTemporaryFile(mode='w', suffix=suffix, delete=False) as f:
            f.write(content)
            temp_file = Path(f.name)
        
        print(f"📝 Created temporary file: {temp_file}")
        yield temp_file
        
    finally:
        # Always clean up
        if temp_file and temp_file.exists():
            temp_file.unlink()
            print(f"🗑️  Cleaned up temporary file: {temp_file}")

@contextmanager
def error_context(operation_name: str):
    """Context manager to provide operation context for errors."""
    
    print(f"🚀 Starting {operation_name}")
    try:
        yield
        print(f"✅ Completed {operation_name}")
    except Exception as e:
        print(f"❌ Failed during {operation_name}: {type(e).__name__}: {e}")
        raise

def demonstrate_context_managers():
    """Demonstrate custom context managers."""
    print("=== Context Manager Demonstrations ===")
    
    # Database transaction example
    with temporary_file_with_content("", suffix='.db') as db_file:
        try:
            with database_transaction(str(db_file)) as conn:
                # Create table and insert data
                conn.execute("""
                    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)
                """)
                conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
                           ("Alice", "alice@example.com"))
                conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
                           ("Bob", "bob@example.com"))
                
                # This would cause a rollback if uncommented
                # raise ValueError("Simulated error")
                
        except Exception as e:
            print(f"Database operation failed: {e}")
    
    # File processing with error context
    sample_content = '{"users": [{"name": "Charlie", "age": 30}]}'
    
    with temporary_file_with_content(sample_content, '.json') as temp_file:
        with error_context("JSON file processing"):
            data = json.loads(temp_file.read_text())
            print(f"Processed data: {data}")
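
A closely related tool, not used above, is contextlib.suppress from the standard library. It covers the narrow case where you deliberately want to ignore one specific exception, such as removing a file that may not exist:

import os
from contextlib import suppress

# Equivalent to try/except FileNotFoundError: pass, but reads as intent
with suppress(FileNotFoundError):
    os.remove("stale_cache.tmp")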

Error Aggregation and Reporting

import json
from dataclasses import dataclass
from enum import Enum
from typing import Any, Dict, List, Optional

# Note: validate_user_record, ValidationError and error_context are reused
# from the earlier examples in this article.

class ErrorSeverity(Enum):
    INFO = "info"
    WARNING = "warning"
    ERROR = "error"
    CRITICAL = "critical"

@dataclass
class ProcessingError:
    """Structured error information."""
    severity: ErrorSeverity
    message: str
    source: str
    line_number: Optional[int] = None
    exception: Optional[Exception] = None
    
    def __str__(self):
        base = f"[{self.severity.value.upper()}] {self.source}: {self.message}"
        if self.line_number:
            base += f" (line {self.line_number})"
        return base

class ErrorCollector:
    """Collect and manage errors during processing."""
    
    def __init__(self):
        self.errors: List[ProcessingError] = []
    
    def add_error(self, severity: ErrorSeverity, message: str, source: str,
                  line_number: Optional[int] = None, exception: Optional[Exception] = None):
        """Add an error to the collection."""
        error = ProcessingError(severity, message, source, line_number, exception)
        self.errors.append(error)
        
        # Print immediately for serious errors
        if severity in (ErrorSeverity.ERROR, ErrorSeverity.CRITICAL):
            print(f"❌ {error}")
        elif severity == ErrorSeverity.WARNING:
            print(f"⚠️  {error}")
    
    def has_errors(self, min_severity: ErrorSeverity = ErrorSeverity.ERROR) -> bool:
        """Check if there are errors of at least the specified severity."""
        severity_levels = {
            ErrorSeverity.INFO: 0,
            ErrorSeverity.WARNING: 1,
            ErrorSeverity.ERROR: 2,
            ErrorSeverity.CRITICAL: 3
        }
        
        min_level = severity_levels[min_severity]
        return any(severity_levels[error.severity] >= min_level for error in self.errors)
    
    def get_summary(self) -> Dict[str, int]:
        """Get error count summary by severity."""
        summary = {severity.value: 0 for severity in ErrorSeverity}
        for error in self.errors:
            summary[error.severity.value] += 1
        return summary
    
    def print_summary(self):
        """Print error summary."""
        if not self.errors:
            print("✅ No errors recorded")
            return
        
        summary = self.get_summary()
        print("\n📊 Error Summary:")
        for severity, count in summary.items():
            if count > 0:
                icon = {"info": "ℹ️", "warning": "⚠️", "error": "❌", "critical": "🚨"}[severity]
                print(f"  {icon} {severity.capitalize()}: {count}")

def process_multiple_files_with_error_collection(filenames: List[str]) -> ErrorCollector:
    """Process multiple files and collect all errors."""
    
    error_collector = ErrorCollector()
    successful_files = 0
    
    for filename in filenames:
        try:
            # Create test file
            create_test_data_file(filename)
            
            with error_context(f"processing {filename}"):
                data = load_and_validate_data(filename, error_collector)
                if data:
                    successful_files += 1
                    error_collector.add_error(
                        ErrorSeverity.INFO,
                        f"Successfully processed {len(data)} records",
                        filename
                    )
        
        except Exception as e:
            error_collector.add_error(
                ErrorSeverity.CRITICAL,
                f"Unexpected error: {e}",
                filename,
                exception=e
            )
    
    print(f"\n📈 Processing Results: {successful_files}/{len(filenames)} files successful")
    error_collector.print_summary()
    
    return error_collector

def load_and_validate_data(filename: str, error_collector: ErrorCollector) -> List[Dict[str, Any]]:
    """Load and validate data from file, collecting errors."""
    
    try:
        with open(filename, 'r', encoding='utf-8') as f:
            data = json.load(f)
    except FileNotFoundError:
        error_collector.add_error(ErrorSeverity.ERROR, "File not found", filename)
        return []
    except json.JSONDecodeError as e:
        error_collector.add_error(ErrorSeverity.ERROR, f"Invalid JSON: {e}", filename)
        return []
    except OSError as e:
        error_collector.add_error(ErrorSeverity.ERROR, f"Cannot read file: {e}", filename)
        return []
    
    if not isinstance(data, list):
        error_collector.add_error(ErrorSeverity.ERROR, "Data must be a list", filename)
        return []
    
    valid_records = []
    
    for index, record in enumerate(data):
        try:
            validated = validate_user_record(record)
            valid_records.append(validated)
        except ValidationError as e:
            error_collector.add_error(
                ErrorSeverity.WARNING,
                str(e),
                filename,
                line_number=index + 1,
                exception=e
            )
    
    return valid_records

def create_test_data_file(filename: str):
    """Create test data files with various error conditions."""
    
    test_data = {
        "valid_users.json": [
            {"name": "Alice Smith", "email": "alice@example.com", "age": 30},
            {"name": "Bob Jones", "email": "bob@example.com", "age": 25}
        ],
        "mixed_users.json": [
            {"name": "Valid User", "email": "valid@example.com", "age": 28},
            {"name": "", "email": "empty_name@example.com", "age": 35},  # Invalid
            {"name": "No Email", "age": 40},  # Missing email
            {"name": "Invalid Age", "email": "invalid@example.com", "age": "not_a_number"}  # Invalid age
        ],
        "invalid_format.json": {"not": "a list"},  # Wrong format
    }
    
    data = test_data.get(filename, [{"name": "Default User", "email": "default@example.com", "age": 25}])
    
    try:
        with open(filename, 'w', encoding='utf-8') as f:
            json.dump(data, f, indent=2)
    except OSError as e:
        print(f"Error creating test file {filename}: {e}")

def demonstrate_error_collection():
    """Demonstrate error collection and reporting."""
    print("=== Error Collection and Reporting ===")
    
    test_files = ["valid_users.json", "mixed_users.json", "invalid_format.json", "nonexistent.json"]
    error_collector = process_multiple_files_with_error_collection(test_files)
    
    # Check if processing should continue based on error severity
    if error_collector.has_errors(ErrorSeverity.CRITICAL):
        print("\n🚨 Critical errors detected - processing cannot continue")
    elif error_collector.has_errors(ErrorSeverity.ERROR):
        print("\n⚠️  Errors detected - review results carefully")
    else:
        print("\n✅ Processing completed successfully")
    
    return error_collector

Performance Considerations with Exceptions

import timeit

def demonstrate_exception_performance():
    """Show the performance implications of exception handling."""
    print("=== Exception Performance Considerations ===")
    
    # Test EAFP vs LBYL performance
    data = {"valid_key": 42}
    
    def eafp_approach():
        """Easier to Ask for Forgiveness than Permission."""
        try:
            return data["some_key"]
        except KeyError:
            return None
    
    def lbyl_approach():
        """Look Before You Leap."""
        if "some_key" in data:
            return data["some_key"]
        else:
            return None
    
    # Time both approaches
    eafp_time = timeit.timeit(eafp_approach, number=100000)
    lbyl_time = timeit.timeit(lbyl_approach, number=100000)
    
    print(f"EAFP (with exception): {eafp_time:.4f} seconds")
    print(f"LBYL (with check): {lbyl_time:.4f} seconds")
    print(f"LBYL is {eafp_time/lbyl_time:.1f}x faster when exceptions occur frequently")
    
    # However, when the key exists...
    data_with_key = {"some_key": 42}
    
    def eafp_success():
        try:
            return data_with_key["some_key"]
        except KeyError:
            return None
    
    def lbyl_success():
        if "some_key" in data_with_key:
            return data_with_key["some_key"]
        else:
            return None
    
    eafp_success_time = timeit.timeit(eafp_success, number=100000)
    lbyl_success_time = timeit.timeit(lbyl_success, number=100000)
    
    print("\nWhen key exists:")
    print(f"EAFP (no exception): {eafp_success_time:.4f} seconds")
    print(f"LBYL (with check): {lbyl_success_time:.4f} seconds")
    print(f"EAFP is {lbyl_success_time/eafp_success_time:.1f}x faster when exceptions don't occur")
    
    print("\n💡 Takeaway: Use EAFP when exceptions are rare, LBYL when they're common")

# Run all demonstrations
if __name__ == "__main__":
    demonstrate_decorators()
    print("\n" + "="*60)
    demonstrate_context_managers()
    print("\n" + "="*60)
    demonstrate_error_collection()
    print("\n" + "="*60)
    demonstrate_exception_performance()

Quick Reference: Common Exception Patterns

Here’s your cheat sheet for the most common Pythonic exception handling patterns:

# 1. File operations
try:
    with open('file.txt', 'r') as f:
        content = f.read()
except FileNotFoundError:
    print("File not found")
except PermissionError:
    print("Permission denied")

# 2. JSON parsing with context
try:
    data = json.loads(text)
except json.JSONDecodeError as e:
    raise ValueError(f"Invalid JSON: {e}") from e

# 3. Multiple exceptions, specific handling
try:
    result = risky_operation()
except (ValueError, TypeError) as e:
    print(f"Input error: {e}")
except OSError as e:
    print(f"System error: {e}")

# 4. Exception chaining
try:
    process_data(data)
except DataError as e:
    raise ProcessingError("Failed to process") from e

# 5. Using else and finally
try:
    result = operation()
except SpecificError:
    handle_error()
else:
    # Only runs if no exception
    success_action()
finally:
    # Always runs
    cleanup()

# 6. Custom exceptions with context
class ValidationError(Exception):
    def __init__(self, message, field=None):
        super().__init__(message)
        self.field = field

# 7. Defensive programming with defaults
def safe_int(value, default=0):
    try:
        return int(value)
    except (ValueError, TypeError):
        return default

That’s Python exception handling in a nutshell - elegant, explicit, and designed to make your code more robust. The key is to embrace exceptions as a natural part of Python programming, not something to avoid. When you handle them thoughtfully, they make your code cleaner and more maintainable than defensive programming approaches.