Wednesday, September 16, 2009

What is the role of banks and financial institutions in America?

The role of banks is to lend money so that people can buy a house, a car, or something else they want. When a bank lends you money, you pay a little of it back each month, and because the bank charges interest on the loan, you end up paying back more than you borrowed.
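
To see why the total ends up higher than the loan itself, here is a minimal sketch using the standard fixed-rate amortization formula. The loan amount, interest rate, and term are made-up example numbers, not figures from this post.

# Sketch: how interest makes a fixed-rate loan cost more than the amount borrowed.
# All numbers below are hypothetical examples.

def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization formula."""
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # total number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

principal = 200_000                   # amount borrowed for a house (hypothetical)
annual_rate = 0.05                    # 5% yearly interest (hypothetical)
years = 30

payment = monthly_payment(principal, annual_rate, years)
total_paid = payment * years * 12

print(f"Monthly payment: ${payment:,.2f}")
print(f"Total paid over {years} years: ${total_paid:,.2f}")
print(f"Extra paid beyond the loan (interest): ${total_paid - principal:,.2f}")

With these example numbers, the monthly payment is around $1,074, so over 30 years the borrower pays back roughly $386,500 on a $200,000 loan; the difference is the interest the bank earns for lending the money.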
