Setting up TypeScript in a Project

  • Setting up TypeScript is an important first step toward leveraging its benefits. Here’s a detailed step-by-step guide that works for any JavaScript project.
Step 1: Install Node.js
  • The TypeScript compiler runs on Node.js, so you need to ensure you have Node.js installed.
  • Once installed, check the version of Node.js and npm (Node Package Manager) using:

   node -v
   npm -v


Step 2: Initialize a New Project (optional)

  • If you don’t have an existing project, create a new project folder and initialize it with 'npm init' to generate a 'package.json' file.

   mkdir my-typescript-project
   cd my-typescript-project
   npm init -y

  • The '-y' flag automatically answers "yes" to all prompts, creating a default 'package.json'.
Step 3: Install TypeScript
  • Now you need to install TypeScript locally in your project. You can install it using npm (or yarn, if preferred).

   npm install typescript --save-dev

  • This adds TypeScript as a development dependency ('--save-dev') and makes the 'tsc' (TypeScript compiler) command available via 'npx' or npm scripts.
Step 4: Initialize TypeScript Configuration (tsconfig.json)
  • To customize and configure TypeScript settings, you need to create a 'tsconfig.json' file. This file defines the root of your project and how TypeScript should be compiled.
  • You can generate it by running the following command:

   npx tsc --init

  • This command creates a 'tsconfig.json' file with defaults and many commented-out options. Trimmed to the most common settings, a basic 'tsconfig.json' might look like this:

    {
        "compilerOptions": {
            "target": "es6",
            "module": "commonjs",
            "strict": true,
            "esModuleInterop": true,
            "skipLibCheck": true
        },
        "include": [
            "src/**/*"
        ],
        "exclude": [
            "node_modules"
        ]
    }

  • target: Specifies which version of JavaScript your TypeScript will compile down to. Common options are 'es5' or 'es6'.
  • module: Specifies the module system (e.g., 'commonjs', 'esnext').
  • strict: Enables all strict type-checking options.
  • esModuleInterop: Enables interop between CommonJS modules and ES module 'import' syntax (see the sketch after this list).
  • include: Tells TypeScript which files to include (here, the 'src' folder).
  • exclude: Excludes files/folders from compilation (e.g., 'node_modules').
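  • As a quick illustration of 'esModuleInterop', here is a minimal sketch using Node's built-in 'path' module, a CommonJS package (it assumes '@types/node' is installed for the type declarations):

    // With "esModuleInterop": true, a CommonJS module can be imported
    // using ES module default-import syntax:
    import path from "path";

    console.log(path.join("src", "index.ts"));

    // Without the flag, the same module would need:
    // import * as path from "path";
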
Step 5: Create Your First TypeScript File
  • Now that TypeScript is set up, create a folder called 'src' and a new TypeScript file inside it, 'index.ts':

   mkdir src
   touch src/index.ts

  • In 'index.ts', write some simple TypeScript code:

   const message: string = "Hello, TypeScript!";
   console.log(message);
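
  • To see static typing at work, you could extend 'index.ts' with a typed function (a minimal sketch; 'greet' is an illustrative name):

    // A function with typed parameters and a typed return value:
    function greet(name: string): string {
        return `Hello, ${name}!`;
    }

    console.log(greet("TypeScript"));
    // greet(42); // compile-time error: number is not assignable to string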


Step 6: Compile TypeScript

  • To compile your TypeScript code into JavaScript, you can use the 'tsc' command:

   npx tsc

  • This will compile all TypeScript files in the project (as per the settings in 'tsconfig.json') and generate corresponding JavaScript files in the same directory or in a specified 'outDir' (if configured).
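  • If you prefer the compiled output in a separate folder, you could add an 'outDir' to 'compilerOptions' ('./dist' here is just a conventional choice; the remaining steps assume the default, which emits JavaScript next to the sources):

    {
        "compilerOptions": {
            "outDir": "./dist"
        }
    }

  • With that setting, the compiled file lands in 'dist/index.js', and the run command in Step 7 becomes 'node dist/index.js'.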
Step 7: Run the Compiled JavaScript
  • Now, run the compiled JavaScript using Node.js:

   node src/index.js

  • You should see the following output in your terminal:

    Hello, TypeScript!


Step 8: Automate Compilation (Optional)

  • If you want TypeScript to automatically compile every time you make a change, you can use the '--watch' option:

   npx tsc --watch

  • This will keep watching for file changes and recompile your TypeScript files automatically.
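  • To avoid typing 'npx' each time, you could also add scripts to 'package.json' (a common convention; 'tsc --init' does not generate these for you):

    "scripts": {
        "build": "tsc",
        "watch": "tsc --watch"
    }

  • Then run 'npm run build' to compile once, or 'npm run watch' to recompile on every change.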
Summary
  • By following these steps, you have successfully set up TypeScript in your project. Here's what we accomplished:
    1. Installed TypeScript using npm.
    2. Generated the 'tsconfig.json' configuration file.
    3. Wrote a simple TypeScript file.
    4. Compiled TypeScript into JavaScript and ran it using Node.js.
  • Now your project is TypeScript-enabled, and you can start building more complex applications using TypeScript’s powerful features like static typing and modern JavaScript support.

Table of Contents for TypeScript

  • Here’s a detailed Table of Contents (TOC) for this TypeScript tutorial series, covering concepts from basic to advanced. The structure follows a progressive learning path, ensuring readers build a solid foundation before moving on to advanced topics.
1. Introduction to TypeScript
  • What is TypeScript?
  • Why use TypeScript over JavaScript?
  • Installing TypeScript
  • Setting up TypeScript in a project
  • Compiling TypeScript to JavaScript
  • TypeScript vs. JavaScript: Key Differences
2. Basic Types
  • Primitive Types: 'number', 'string', 'boolean', 'null', 'undefined'
  • Array and Tuple Types
  • Any, Unknown, Never Types
  • Type Inference and Explicit Typing
3. Functions in TypeScript
  • Basic Function Types and Signatures
  • Optional and Default Parameters
  • Rest Parameters
  • Function Overloading
  • Arrow Functions
4. Interfaces and Object Types
  • Defining Interfaces
  • Optional and Readonly Properties
  • Extending Interfaces
  • Intersection Types
  • Type Assertions
5. Classes and Object-Oriented Programming (OOP)
  • Classes: Basics and Constructors
  • Inheritance and Super
  • Access Modifiers: 'public', 'private', 'protected'
  • Static Properties and Methods
  • Getters and Setters
  • Abstract Classes
6. Advanced Types
  • Union and Intersection Types
  • Literal Types
  • Type Aliases
  • Type Guards and Type Narrowing
  • Discriminated Unions
  • Type Compatibility and Subtype Polymorphism
7. Enums in TypeScript
  • Numeric Enums
  • String Enums
  • Heterogeneous Enums
  • Enum Member Types
8. Generics
  • Introduction to Generics
  • Generic Functions
  • Generic Classes
  • Generic Constraints
  • Using 'keyof' and 'typeof' with Generics
9. Modules and Namespaces
  • Working with ES Modules in TypeScript
  • Export and Import Statements
  • Namespaces vs. Modules
  • Code Splitting and Modularization
10. TypeScript with JavaScript Libraries
  • Using '.d.ts' Files
  • Working with Third-Party JavaScript Libraries
  • @types Definitions for Popular Libraries
  • Mixing TypeScript and JavaScript
11. TypeScript Compiler (tsc) Configuration
  • Configuring tsconfig.json
  • Key Compiler Options
  • Strict Type Checking Options
  • Incremental Compilation
12. Asynchronous Programming in TypeScript
  • Promises in TypeScript
  • Async/Await with TypeScript
  • Error Handling with Async Functions
  • Typed Promises
13. TypeScript with React
  • TypeScript Basics with React
  • Typing Functional Components
  • Typing Props and State
  • Using Hooks with TypeScript (useState, useEffect, etc.)
  • Typing Events and Forms in React
14. TypeScript with Node.js
  • Setting up TypeScript with Node.js
  • Working with Type Definitions in Node.js
  • Building Express Apps with TypeScript
  • Handling Asynchronous Operations with Node.js and TypeScript
15. TypeScript with Frameworks and Tools
  • TypeScript with Next.js
  • TypeScript with Angular
  • TypeScript with Vue.js
  • TypeScript with Redux
16. Advanced TypeScript Concepts
  • Conditional Types
  • Mapped Types
  • Template Literal Types
  • Utility Types ('Partial', 'Required', 'Pick', 'Omit')
  • Recursive Types
17. Error Handling and Debugging
  • Common TypeScript Errors
  • Debugging TypeScript Code
  • Linting TypeScript with ESLint
  • Using TypeScript with Prettier
18. Testing in TypeScript
  • Unit Testing with TypeScript
  • Using Jest with TypeScript
  • Mocking and Assertions in TypeScript Tests
  • Integration Testing with TypeScript
19. TypeScript Project Best Practices
  • Organizing a TypeScript Project
  • Code Documentation and Comments
  • Type-Safe API Calls
  • Optimizing Build and Compilation Times
20. Deploying TypeScript Applications
  • Building for Production
  • Transpiling and Bundling TypeScript with Webpack or Vite
  • Deploying TypeScript Apps to Cloud Platforms (Vercel, Netlify, etc.)
21. TypeScript Performance and Optimization
  • Optimizing TypeScript Code for Performance
  • Tree Shaking with TypeScript
  • Improving Compilation Speed
  • Memory Management and Efficient Type Checking
Bonus Topics (Optional for Advanced Audience)
  • Decorator Patterns in TypeScript
  • Meta-Programming with TypeScript
  • TypeScript and GraphQL
  • TypeScript for Large-Scale Applications
  • Migrating Legacy JavaScript to TypeScript
  • This structure lets learners move from the very basics of TypeScript to complex, real-world applications. The depth of each topic can be adjusted to the target audience, and each section can include code examples, projects, or exercises to make the tutorial more interactive.

Schema Validation in MongoDB

  • In MongoDB, schema validation is a way to ensure that documents inserted into a collection adhere to a specific structure or pattern. This is done by defining rules or constraints using a JSON Schema. Schema validation is not mandatory in MongoDB because of its flexible schema model, but it can be helpful when you want to enforce data integrity or avoid errors due to improperly formatted data.
  • Here is a detailed breakdown of schema validation in MongoDB:
What is Schema Validation?
  • Schema validation defines rules that must be followed when inserting, updating, or replacing documents in a collection. It ensures that the documents conform to a specific structure, type, or other constraints, which helps in maintaining data quality.
Schema Validation Concepts
  • JSON Schema: MongoDB’s schema validation is based on the JSON Schema standard. JSON Schema provides a way to describe the structure of JSON data.
  • Validation Level: Defines how strictly the rules will be enforced.
    • 'strict': Validation is applied for both inserts and updates.
    • 'moderate': Validation is applied to inserts, and to updates only when the existing document already satisfies the rules; non-conforming documents that predate the rules can still be updated without validation.
  • Validation Action: Defines the behavior when documents don't meet the validation rules.
    • 'error': Rejects documents that don’t follow the validation rules.
    • 'warn': Allows invalid documents but logs a warning.
  • Validation Rules: You define rules (constraints) that documents must adhere to. These could include:
    • Data types (e.g., string, number, object)
    • Required fields
    • Field value constraints (e.g., regex, minimum/maximum values)
    • Nested object validation
Creating Schema Validation
  • Schema validation is usually done during collection creation, but it can also be applied to an existing collection.
Example of a Simple Schema Validation
  • Let’s say we are creating a 'users' collection where each document must have the following structure:
    • 'name': a required string field.
    • 'age': an optional integer field that must be between 18 and 60.
    • 'email': a required string field that must match a valid email format.
    • 'address': an object containing:
      • 'city': a required string field.
      • 'zipcode': a required string field with a 5-digit format.
  • Here’s how we can define schema validation for this collection:

  db.createCollection("users", {
    validator: {
      $jsonSchema: {
        bsonType: "object",
        required: ["name", "email", "address"],
        properties: {
          name: {
            bsonType: "string",
            description: "must be a string and is required"
          },
          age: {
            bsonType: "int",
            minimum: 18,
            maximum: 60,
            description: "must be an integer in the range 18 to 60"
          },
          email: {
            bsonType: "string",
            pattern: "^.+@.+$",
            description: "must be a valid email and is required"
          },
          address: {
            bsonType: "object",
            required: ["city", "zipcode"],
            properties: {
              city: {
                bsonType: "string",
                description: "must be a string and is required"
              },
              zipcode: {
                bsonType: "string",
                pattern: "^[0-9]{5}$",
                description: "must be a 5-digit string and is required"
              }
            }
          }
        }
      }
    },
    validationLevel: "strict",
    validationAction: "error"
  });


Breakdown of the Example

  • 'bsonType: "object"': This indicates that each document in the 'users' collection must be a BSON object (i.e., a document).
  • 'required: ["name", "email", "address"]': This specifies that the fields 'name', 'email', and 'address' must be present in every document.
  • Field Properties:
    • 'name': Must be a string and is required.
    • 'age': Must be an integer, and its value must fall between 18 and 60, but it's optional.
    • 'email': Must be a string, and it must match a basic email pattern ('^.+@.+$').
    • 'address': Must be an object containing the fields 'city' and 'zipcode'. Each of these fields must be a string, and 'zipcode' must follow a 5-digit format.
Validation Levels
  • Strict: All insert and update operations will be checked against the validation rules. For example, inserting a document like:

  db.users.insertOne({
    name: "John Doe",
    age: NumberInt(30),  // NumberInt() stores a BSON int; a plain 30 would be a double and fail bsonType "int"
    email: "john@example.com",
    address: { city: "New York", zipcode: "10001" }
  });

  • would succeed because it meets the validation rules.
  • But, inserting this:

  db.users.insertOne({
    name: "Jane Doe",
    age: 17,  // invalid age
    email: "janeexample.com", // invalid email format
    address: { city: "Los Angeles", zipcode: "9001" }  // invalid zipcode
  });

  • would throw an error because 'age' is less than 18, the email is not valid, and the zipcode is not 5 digits.
  • Moderate: Inserts are always validated, but updates are validated only when the existing document already meets the rules. Documents that predate the validation rules (and don't conform) can therefore still be updated without triggering validation errors, as the sketch below shows.
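  • For example, assuming a document that was inserted before validation was enabled (and so lacks a valid 'email' and 'address'), 'moderate' still lets it be updated:

  // Assume this document predates the validation rules:
  // { _id: ..., name: "Legacy User" }

  // Under validationLevel: "moderate", this update succeeds even though
  // the document still doesn't satisfy the schema:
  db.users.updateOne({ name: "Legacy User" }, { $set: { name: "Renamed User" } });

  // Under validationLevel: "strict", the same update would be rejected.
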
Validation Actions
  • Error: Invalid documents will be rejected, and an error message will be thrown. This is helpful when you want to enforce strict data integrity.
  • Warn: The operation will still succeed, but MongoDB will log a warning that the document does not conform to the validation rules. This is useful for transitioning to schema validation without immediately enforcing it (see the sketch below).
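  • As a sketch, you could switch the 'users' collection to warn-only enforcement while testing new rules, using 'collMod' (covered in more detail below):

  db.runCommand({
    collMod: "users",
    validationAction: "warn"  // invalid documents are logged, not rejected
  });
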
Updating an Existing Collection's Validation
  • If you already have a collection, you can add or replace schema validation using the 'collMod' command (note that the new validator replaces any existing one wholesale):

  db.runCommand({
    collMod: "users",
    validator: {
      $jsonSchema: {
        bsonType: "object",
        required: ["name", "email", "address"],
        properties: {
          name: {
            bsonType: "string",
            description: "must be a string and is required"
          },
          email: {
            bsonType: "string",
            pattern: "^.+@.+$",
            description: "must be a valid email and is required"
          }
        }
      }
    },
    validationLevel: "strict",
    validationAction: "error"
  });


Common Use Cases of Schema Validation

  • Data Quality: Ensure that all documents conform to a set of rules, preventing incorrect data entry.
  • Type Enforcement: Enforce types like 'string', 'number', 'date', etc., so that data can be relied upon during processing.
  • Range or Format Constraints: Ensure fields like 'age' fall within a specific range or that fields like 'email' or 'zipcode' follow a specific format.
  • Required Fields: Make sure certain fields are always present in a document.
Best Practices
  • Define schema validation during collection creation to enforce clean data from the beginning.
  • Use moderate validation during data migration or when transitioning to a more structured schema.
  • Use the 'warn' action for testing new validation rules without disrupting operations.
  • Schema validation in MongoDB is a powerful feature that lets you enforce structure in a flexible, NoSQL database. It helps ensure data integrity while still maintaining the flexibility that MongoDB is known for.

Change Streams in MongoDB

  • Change Streams in MongoDB allow applications to listen for real-time changes in collections, databases, or even the entire cluster. This capability is especially useful for applications that require immediate responses to data changes, such as real-time notifications, auditing, data synchronization, and event-driven architectures.
  • Introduced in MongoDB 3.6, change streams allow developers to track operations like insertions, updates, deletions, and more. They provide an event-driven way to handle data changes without the need for periodic polling, making them both efficient and scalable.
What are Change Streams?
  • Change streams are a MongoDB feature that provides a way to track and react to changes happening within a collection or database. When a change occurs (like inserting a document or updating an existing one), MongoDB emits an event through the change stream that can be consumed by the application in real-time.
Change streams can be used on:
  • Collections: Listen to changes in a specific collection.
  • Databases: Listen to changes in all collections within a database.
  • Clusters: Listen to changes across the entire cluster.
  • MongoDB uses the oplog (operation log) to track these changes and deliver them to the change stream consumers.
Key Features of Change Streams
  • Real-Time Updates: React to data changes in real time as they happen in the database.
  • Scalability: Change streams are efficient because they are built on MongoDB's oplog, providing real-time notifications with minimal performance overhead.
  • Resumability: If your application goes down, you can resume watching the change stream from the last event received using a resume token.
  • Filtering Changes: You can filter specific types of changes (e.g., only inserts, or only changes in specific fields) using MongoDB's aggregation framework.
  • Cluster, Database, and Collection Level: You can listen to changes across an entire cluster, a single database, or just a single collection, depending on the level of granularity required.
  • Event-Driven Architecture: Change streams can be used to trigger actions in response to specific changes in the database, enabling an event-driven system.
Types of Events in Change Streams
  • When a change happens, the change stream emits a change document. Each change document contains detailed information about the change event, including:
    • operationType: The type of operation that triggered the event, such as insert, update, delete, etc.
    • documentKey: The unique _id of the document that was modified.
    • fullDocument: The complete document as it exists after the change. It is included by default for insert and replace operations; for updates it appears only when the stream is opened with the 'fullDocument: "updateLookup"' option (a sketch follows the list of operation types below).
    • updateDescription: For update operations, this field provides the changes made, showing the fields that were modified.
  • The common operationType values are:
    • insert: A new document has been inserted into the collection.
    • update: An existing document has been updated.
    • replace: A document has been replaced.
    • delete: A document has been deleted.
    • invalidate: The change stream is no longer valid (e.g., if the collection is dropped).
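  • For update events, the post-update document is only attached when you ask for it. A minimal sketch of opening a stream with the 'updateLookup' option:

    const changeStream = db.orders.watch([], { fullDocument: "updateLookup" });
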
How Change Streams Work
  • Change streams utilize the oplog (operation log) of MongoDB’s replication system. The oplog is a special capped collection that stores a log of every operation that modifies the data in the database (e.g., insert, update, delete).
When a change happens:
  1. MongoDB records the change in the oplog.
  2. If there is an active change stream watching for changes, MongoDB pushes the event to the change stream client.
  3. The client can then process the event and take appropriate action.
Change Streams Require a Replica Set
  • Change streams in MongoDB are only supported on replica sets or sharded clusters. A standalone MongoDB server supports neither change streams nor the watch() method.
  • Change streams rely on the oplog (operation log) used in replication. The oplog is a special capped collection in MongoDB that records all changes made to the data in the database. The oplog is only maintained in replica sets, as it's primarily used to replicate changes to other members of the replica set.
  • Standalone MongoDB instances don't maintain an oplog because there are no other nodes to replicate data to, hence change streams can't be used on standalone servers.
How to Enable Change Streams: Set Up a Replica Set Locally
  • If you want to experiment with change streams on your local machine, you can convert your MongoDB instance into a replica set. This is a common way to test features like change streams during development.
Steps to Set Up a Replica Set Locally
  • Stop your running MongoDB instance (if applicable).
Start MongoDB with Replica Set Configuration:
  • You need to start MongoDB with the --replSet option to enable replica set functionality.
  • From a terminal, start the MongoDB server ('mongod') by running this command (add '--dbpath <path>' if your data directory is not the default):

    mongod --replSet rs0

  • This command starts MongoDB with the replica set name rs0.
Initialize the Replica Set:
  • Once MongoDB is running, open a new MongoDB shell and run the following command to initiate the replica set:

    rs.initiate()

  • This command initializes the replica set configuration for the MongoDB instance. You should see output like this:

    {
        "ok": 1,
        "operationTime": Timestamp(1627647186, 1),
        "$clusterTime": {
            "clusterTime": Timestamp(1627647186, 1),
            "signature": {
                "hash": BinData(0, ""),
                "keyId": NumberLong("0")
            }
        }
    }

  • Verify the Replica Set: To ensure your MongoDB instance is now part of a replica set, you can run the following command in the shell:

    rs.status()

  • This will display the status of your replica set, and if everything is correct, you should see that your instance is now a primary node in the replica set.
Now, Running Change Streams
  • Once your MongoDB instance is running as part of a replica set, you can open a change stream.
  • Conclusion: If you want to use change streams, you must run MongoDB as part of a replica set. For local development, it's quite easy to set up a single-node replica set using the --replSet option. After setting up the replica set, you’ll be able to use features like change streams without encountering errors.
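  • The same pattern works from application code, not just the shell. Here is a minimal sketch using the official Node.js driver in TypeScript (assuming 'npm install mongodb' and the single-node replica set 'rs0' set up above):

    import { MongoClient } from "mongodb";

    async function main() {
        const client = new MongoClient("mongodb://localhost:27017/?replicaSet=rs0");
        await client.connect();

        const orders = client.db("ecommerce").collection("orders");

        // watch() returns a change stream, which the driver exposes
        // as an async iterable:
        for await (const change of orders.watch()) {
            if (change.operationType === "insert") {
                console.log("new order:", change.fullDocument);
            } else {
                console.log("change:", change.operationType);
            }
        }
    }

    main().catch(console.error);
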
Basic Example of a Change Stream
  • Let’s see how we can use a change stream in the MongoDB shell to listen to changes in a collection called orders.
  • Step 1: Setup the orders Collection

    use ecommerce

    db.orders.insertMany([
        { order_id: 101, customer_name: "John Doe", total: 1000 },
        { order_id: 102, customer_name: "Jane Smith", total: 500 }
    ])

  • This creates a collection named orders with two initial documents.
  • Step 2: Open a Change Stream to Watch for Changes
  • To listen for changes in the orders collection, you can use the watch() method:

    const changeStream = db.orders.watch();

  • This opens a change stream that listens for changes in the orders collection.
  • Step 3: Simulating Some Changes
  • In a different MongoDB shell tab or process, you can modify the orders collection:

    // Insert a new order
    db.orders.insertOne({
        order_id: 103, customer_name: "Alice Cooper", total: 750
    });

    // Update an existing order
    db.orders.updateOne({ order_id: 101 }, { $set: { total: 1200 } });

    // Delete an order
    db.orders.deleteOne({ order_id: 102 });

  • Step 4: Receiving and Handling Changes
  • The change stream will now capture the changes and output them in the first shell where you started the change stream.

    changeStream.forEach(change => printjson(change));

  • You might see the following output:

    {
        "operationType": "insert",
        "fullDocument": {
            "_id": ObjectId("64c23ef349123abf12abcd45"),
            "order_id": 103,
            "customer_name": "Alice Cooper",
            "total": 750
        },
        "ns": {
            "db": "ecommerce",
            "coll": "orders"
        },
        "documentKey": {
            "_id": ObjectId("64c23ef349123abf12abcd45")
        }
    }
   
    {
        "operationType": "update",
        "documentKey": {
            "_id": ObjectId("64c23ef349123abf12abcd34")
        },
        "updateDescription": {
            "updatedFields": {
                "total": 1200
            },
            "removedFields": []
        }
    }
   
    {
        "operationType": "delete",
        "documentKey": {
            "_id": ObjectId("64c23ef349123abf12abcd36")
        }
    }


Explanation of Output:

  • Insert Operation:
    • A new order for Alice Cooper was inserted with a total of 750.
    • operationType: "insert" indicates that a new document was added.
    • fullDocument contains the new document inserted into the orders collection.
  • Update Operation:
    • The order for John Doe (order ID: 101) was updated, and the total field was changed to 1200.
    • operationType: "update" indicates that an existing document was updated.
    • updateDescription.updatedFields shows which fields were modified (total in this case).
  • Delete Operation:
    • The order for Jane Smith (order ID: 102) was deleted.
    • operationType: "delete" indicates that a document was deleted.
    • documentKey provides the _id of the deleted document.
Change Streams with Filters and Aggregation
  • You can apply filters and use aggregation pipelines with change streams to monitor specific types of changes or perform more complex operations on the incoming data.
  • For example, if you only want to listen for insert operations, you can apply a filter like this:

    const pipeline = [
        { $match: { operationType: "insert" } }
    ];

    const changeStream = db.orders.watch(pipeline);

    changeStream.forEach(change => printjson(change));


In this example:

  • We use the $match stage to filter the stream and only receive documents where the operationType is insert.
  • Now, the change stream will only output changes related to insert operations, ignoring updates and deletions.
More Complex Aggregation Example
  • You can also use more advanced aggregation stages in your change stream. For instance, let’s say you want to monitor changes but only care about orders where the total exceeds 1000 after the update:

    const pipeline = [
        {
            $match: {
                $or: [
                    { "fullDocument.total": { $gt: 1000 } },
                    { "updateDescription.updatedFields.total": { $gt: 1000 } }
                ]
            }
        }
    ];

    const changeStream = db.orders.watch(pipeline);

    changeStream.forEach(change => printjson(change));


Here:

  • The change stream will listen for any inserts or updates where the total is greater than 1000.
  • For insert operations, the fullDocument.total is checked.
  • For update operations, the updateDescription.updatedFields.total is checked.
Resuming Change Streams
  • If your application loses its connection to MongoDB or needs to restart, you can resume the change stream from the last received event using a resume token. This ensures that you don’t miss any changes while the connection was down.
How to Use Resume Tokens
  • Every change document that MongoDB emits carries its resume token in the '_id' field. You can store this token and use it later to resume the change stream from where it left off.

    let resumeToken = null;

    const changeStream = db.orders.watch();

    changeStream.forEach(change => {
        printjson(change);
        resumeToken = change._id;  // Store the resume token
    });

  • Later, if your application needs to restart:

    const changeStream = db.orders.watch([], { resumeAfter: resumeToken });

    changeStream.forEach(change => printjson(change));

  • This ensures that you don’t miss any events between the initial disconnection and the reconnection.
Use Cases for Change Streams
  • Real-Time Notifications: You can notify users when a document (e.g., an order or a product) changes, triggering emails or notifications.
  • Audit Trails: Use change streams to track modifications to critical documents, creating audit logs for compliance or security purposes.
  • Cache Synchronization: Keep an in-memory cache synchronized with changes happening in the MongoDB database.
  • Data Replication: Synchronize data across different databases or systems by listening for changes and replicating them in real time.
  • Analytics & Monitoring: Monitor data in real time for anomalies, trends, or other events requiring immediate attention.
Conclusion
  • Change streams are a powerful feature of MongoDB that allow you to watch and react to changes in your collections in real-time. They provide an efficient way to build event-driven systems, ensuring that you can keep data in sync across different systems, notify users of important changes, or even maintain audit trails with minimal performance impact.
  • By applying aggregation pipelines and resume tokens, you can customize change streams to meet specific business needs and handle temporary disconnections gracefully.

Debouncing and Throttling in JavaScript

  • Think of debouncing and throttling as traffic controllers for your functions. Debouncing waits until the user stops triggering an event (a quiet period) and then runs the function once; throttling runs the function at most once per fixed interval, no matter how often the event fires.
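
  • A minimal sketch of both patterns in TypeScript (the helper names and delays are illustrative):

    // Debounce: run fn only after `delay` ms have passed with no new calls.
    function debounce<T extends unknown[]>(fn: (...args: T) => void, delay: number) {
        let timer: ReturnType<typeof setTimeout> | undefined;
        return (...args: T) => {
            clearTimeout(timer);
            timer = setTimeout(() => fn(...args), delay);
        };
    }

    // Throttle: run fn at most once every `interval` ms.
    function throttle<T extends unknown[]>(fn: (...args: T) => void, interval: number) {
        let last = 0;
        return (...args: T) => {
            const now = Date.now();
            if (now - last >= interval) {
                last = now;
                fn(...args);
            }
        };
    }

    // Usage: debounce a search box, throttle a scroll handler.
    const onSearch = debounce((query: string) => console.log("search:", query), 300);
    const onScroll = throttle(() => console.log("scroll"), 200);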