What is the role of banks and financial institutions in America?
The role of banks is to lend money so that people can buy a house, a car, or something else they want. When a bank lends you money, you repay it in small monthly payments, and because the bank charges interest you end up paying back more than you borrowed.
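To see why the total repaid ends up larger than the loan, here is a minimal sketch using the standard fixed-payment loan formula. The numbers ($200,000 borrowed at 5% annual interest for 30 years) and the function name are hypothetical, chosen only to illustrate the point made above.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate loan payment: P * r / (1 - (1 + r)**-n),
    where r is the monthly interest rate and n the number of monthly payments."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical example: $200,000 borrowed at 5% annual interest for 30 years.
principal = 200_000
payment = monthly_payment(principal, 0.05, 30)
total_paid = payment * 30 * 12

print(f"Monthly payment: ${payment:,.2f}")     # about $1,073.64
print(f"Total repaid:    ${total_paid:,.2f}")  # about $386,512 -- far more than the $200,000 borrowed
```

In this example the borrower repays roughly $186,500 more than the original loan; that difference is the interest the bank earns for lending the money.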