How does increasing the size of the training data impact the performance of a machine learning model?

I am exploring how the size of the training data affects my model's performance. I have already trained the model on a small dataset, and I want to understand how its performance changes as I add more training data. What are the key factors I should consider? Which metrics should I track? Can you provide a Python example of how to run this kind of experiment?
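
For concreteness, here is a minimal sketch of the kind of experiment I have in mind, using scikit-learn's `learning_curve`. The dataset and model (`make_classification`, `LogisticRegression`) are placeholders standing in for my real data and model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

# Synthetic stand-in for my actual dataset
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)

model = LogisticRegression(max_iter=1000)

# Evaluate the model at increasing training-set sizes with 5-fold CV
train_sizes, train_scores, val_scores = learning_curve(
    model, X, y,
    train_sizes=np.linspace(0.1, 1.0, 8),  # 10% up to 100% of the training folds
    cv=5,
    scoring="accuracy",
)

# Mean train/validation accuracy at each training-set size
for n, tr, va in zip(train_sizes,
                     train_scores.mean(axis=1),
                     val_scores.mean(axis=1)):
    print(f"n={n:5d}  train_acc={tr:.3f}  val_acc={va:.3f}")
```

My rough expectation is that validation accuracy should rise and then plateau as the training set grows, while training accuracy drops toward it, but I am not sure whether this is the right way to measure the effect.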

I would appreciate any suggestions for optimizing my model’s performance.