Monday, October 18, 2010

More is too more

In 1946 only one programmable electronic computer existed in the entire world. A few years later dozens existed; by the 1960s, hundreds. These computers were still so fearsomely expensive that developers worked hard to minimize the resources their programs consumed.

Though the microprocessor caused the price of compute cycles to plummet, individual processors still cost many dollars. By the 1990s, companies such as Microchip and Zilog were already selling complete microcontrollers for sub-dollar prices. For the first time most embedded applications could cost-effectively exploit partitioning into multiple CPUs. However, few developers actually do this; the non-linear schedule/LoC curve is nearly unknown in embedded circles.

Today's cheap transistor-rich ASICs coupled with tiny 32-bit processors provided as "soft" IP mean designers can partition their programs over multiple—indeed many—processors to dramatically accelerate development schedules. Faster product delivery means lower engineering costs. When NRE is properly amortized over production quantities, lower engineering costs translate into lower cost of goods. The customer gets that cool new electronic goody faster and cheaper.

Or, you can continue to build huge monolithic programs running on a single processor, turning a win-win situation into one where everyone loses.

Some other benefits

Smaller systems contain fewer bugs, of course, but they also tend to have a much lower defect rate. A recent study by Chou, Yang, Chelf, and Hallem evaluated Linux version 2.4.9 In this released and presumably debugged code, an automatic static code checker identified many hundreds of mistakes. Error rates for big functions were two to six times higher than for smaller routines. Partition to accelerate the schedule, and ship a higher quality product.

NIST (the National Institute of Standards and Technology) found that poor testing accounts for some $22 billion in software failures each year.10 Testing is hard; as programs grow the number of execution paths explodes. Robert Glass estimates that for each 25% increase in program size, the program complexity—represented by paths created by function calls, decision statements and the like—doubles.11
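To put a number on Glass's rule: if complexity doubles with every 25% growth in size, then complexity scales as roughly the 3.1 power of program size. Here's a quick back-of-the-envelope check in C (my sketch, not Glass's):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* If complexity doubles per 25% size growth, then
           complexity ~ size^(ln 2 / ln 1.25), roughly size^3.1. */
        double exponent = log(2.0) / log(1.25);
        printf("complexity ~ size^%.2f\n", exponent);
        printf("10x the code -> %.0fx the complexity\n", pow(10.0, exponent));
        return 0;
    }

Ten times the code buys well over a thousand times the paths to test.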

Standard software-testing techniques simply don't work. Most studies find that conventional debugging and QA evaluations find only half the program bugs.

Partitioning the system into smaller chunks, distributed over multiple CPUs, with each having a single highly cohesive function, reduces the number of control paths and makes the system inherently more testable.

In no other industry can a company ship a poorly-tested product, often with known defects, without being sued. Eventually the lawyers will pick up the scent of fresh meat in the firmware world.

Partition to accelerate the schedule, ship a higher quality product and one that's been properly tested. Reduce the risk of litigation.

Adding processors increases system performance and, not surprisingly, simultaneously reduces development time. A rule of thumb states that a system loaded to 90% of processor capacity doubles development time (versus one loaded at 70% or less).12 At 95% processor loading, expect the project schedule to triple. When there's only a handful of bytes left over, adding even the most trivial new feature can take weeks as the developers rewrite massive sections of code to free up a bit of ROM. When CPU cycles are in short supply an analogous situation occurs.
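The rule of thumb is easy to capture in code. A minimal sketch follows; the values at 70, 90, and 95% loading come from the rule above, while the linear interpolation between those points is my own assumption, not data:

    /* Rough schedule multiplier for a given CPU loading, per the rule of
       thumb: ~1x at or below 70%, ~2x at 90%, ~3x at 95% and beyond.
       Interpolation between the published points is a guess. */
    double schedule_multiplier(double loading)
    {
        if (loading <= 0.70) return 1.0;
        if (loading <= 0.90) return 1.0 + (loading - 0.70) / 0.20;
        if (loading <= 0.95) return 2.0 + (loading - 0.90) / 0.05;
        return 3.0;  /* beyond 95%: at least 3x, probably much worse */
    }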

The superprogrammer effect

Developers come in all sorts of flavors, from somewhat competent plodders to miracle workers who effortlessly create beautiful code in minutes. Management's challenge is to recognize the superprogrammers and use them efficiently. Few bosses do; the best programmers get lumped with relative dullards to the detriment of the entire team.
Big projects wear down the superstars. Their glazed eyes reflect the meeting and paperwork burden; their creativity is thwarted by endless discussions and memos.

The moral is clear and critically important: wise managers put their very best people on the small sections partitioned off of the huge project. Divide your system over many CPUs and let the superprogrammers attack the smallest chunks.

Though most developers view themselves as superprogrammers, competency follows a bell curve. Ignore self-assessments. In "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," Justin Kruger and David Dunning showed that though the top half of performers were pretty accurate in evaluating their own performance, the bottom half were wildly optimistic when rating themselves.

Reduce NRE, save big bucks

Hardware designers will shriek when you propose adding processors just to accelerate the software schedule. Though they know transistors have little or no cost, the EE zeitgeist is to always minimize the bill of materials. Yet since the dawn of the microprocessor age, it has been routine to add parts just to simplify the code. No one today would consider building a software UART, though it's quite easy to do and wasn't terribly uncommon decades ago. Implement asynchronous serial I/O in code and the structure of the entire program revolves around the software UART's peculiar timing requirements. Consequently, the code becomes a nightmare. So today we add a hardware UART without complaint. The same can be said about timers, pulse-width modulators, and more. The hardware and software interact as a synergistic whole orchestrated by smart designers who optimize both product and engineering costs.

Sum the hardware component prices, add labor and overhead, and you still haven't properly accounted for the product's cost. Non-recurring engineering (NRE) is just as important as the price of the printed circuit board. Detroit understands this. It can cost more than $2 billion to design and build the tooling for a new car. Sell one million units and the consumer must pay $2,000 above the component costs to amortize those NRE bills.
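The amortization arithmetic is trivial but worth writing down; a sketch:

    #include <stdio.h>

    /* True per-unit cost: component (BOM) cost plus NRE amortized
       over the production run. */
    double unit_cost(double bom, double nre, double units)
    {
        return bom + nre / units;
    }

    int main(void)
    {
        /* Detroit's example: $2 billion of NRE over one million cars
           adds $2,000 to each vehicle before counting any components. */
        printf("NRE burden: $%.0f per car\n", unit_cost(0.0, 2e9, 1e6));
        return 0;
    }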

Similarly, when we in the embedded world save NRE dollars by delivering products faster, we reduce the system's recurring cost. Everyone wins.

Sure, there's a cost to adding processors, especially when doing so means populating more chips on the printed circuit board. But transistors are particularly cheap inside an ASIC. A full 32-bit CPU can consume as little as 20,000 to 30,000 gates. Interestingly, users of configurable processor cores average about six 32-bit processors per ASIC, with at least one customer's chip using more than 180 processors! So if time to market is really important to your company (and when isn't it?), if the code naturally partitions well, and if CPUs are truly cheap, what happens when you break all the rules and add lots of processors? Using the COCOMO numbers, a one-million-LoC program divided over 100 CPUs can be completed five times faster than using the traditional monolithic approach, at about one-fifth the cost.
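A quick sanity check of that claim, using the schedule equation developed in the COCOMO discussion further down this page (MM = 2.8 × KSLoC^1.35, with all 15 cost drivers taken as 1.0):

    #include <math.h>
    #include <stdio.h>

    /* Man-months for a program of ksloc thousand lines, using the
       2.8 * KSLoC^1.35 relationship discussed below. */
    static double man_months(double ksloc)
    {
        return 2.8 * pow(ksloc, 1.35);
    }

    int main(void)
    {
        double mono  = man_months(1000.0);        /* one 1M-LoC monolith */
        double parts = 100.0 * man_months(10.0);  /* 100 CPUs x 10 KLoC  */

        printf("monolithic:  %6.0f man-months\n", mono);   /* ~31,400 */
        printf("partitioned: %6.0f man-months\n", parts);  /* ~6,300  */
        printf("ratio: %.1f\n", mono / parts);             /* ~5      */

        /* The monolith also works out to about 32 LoC/month, the
           "chain gang" number cited later on this page. */
        printf("monolithic productivity: %.0f LoC/month\n", 1e6 / mono);
        return 0;
    }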

Extreme partitioning is the silver bullet to solving the software productivity crisis.

Intel's pursuit of massive performance gains has given the industry a notion that processors are big, ungainly, transistor-consuming, heat-dissipating chips. That's simply not true; the idea only holds water in the strange byways of the desktop world. An embedded 32-bitter is only about an order of magnitude larger than the 8080, the first really useful processor introduced in 1974.

Obviously, not all projects partition as cleanly as those described here. Some do better, when handling lots of fast data in a uniform manner. The same code might live on many CPUs. Others aren't so orthogonal. But only a very few systems fail to benefit from clever partitioning.

Cheating the schedule's exponential growth

The upper curve in Figure 3 is the COCOMO schedule; the lower one assumes we're building systems at the 20-KLoC/program productivity level. The schedule, and hence costs, grow linearly with increasing program size.

But how can we operate at these constant levels of productivity when programs exceed 20 KLoC? The answer: by partitioning! And by understanding the implications of Brooks's Law and DeMarco and Lister's study. The data is stark: if we don't compartmentalize the project, divide it into small chunks that can be created by tiny teams working more or less in isolation, schedules will balloon exponentially.

Professionals look for ways to maximize their effectiveness. As professional software engineers we have a responsibility to find new partitioning schemes. Currently 70 to 80% of a product's development cost is consumed by the creation of code and that ratio is only likely to increase because firmware size doubles about every 10 months. Software will eventually strangle innovation without clever partitioning.

Academics drone endlessly about top-down decomposition (TDD), a tool long used for partitioning programs. Split your huge program into many independent modules; divide these modules into functions, and only then start cranking code.

TDD is the biggest scam ever perpetrated on the software community.

Though TDD is an important strategy that allows wise developers to divide code into manageable pieces, it's not the only tool available to us for partitioning programs. TDD is merely one arrow in the quiver, never used to the exclusion of all else. We in the embedded systems industry have some tricks unavailable to IT programmers.

One way to keep productivity levels high—at, say, the 20-KLoC/program number—is to break the program up into a number of discrete entities, each no more than 20 KLoC long. Encapsulate each piece of the program into its own processor.

That's right: add CPUs merely for the sake of accelerating the schedule and reducing engineering costs.

Sadly, most 21st-century embedded systems look an awful lot like mainframe computers of yore. A single CPU manages a disparate array of sensors, switches, communications links, PWMs, and more. Dozens of tasks handle many sorts of mostly unrelated activities. A hundred thousand lines of code all linked into a single executable enslaves dozens of programmers all making changes throughout a Byzantine structure no one completely comprehends. Of course development slows to a crawl.

Transistors are cheap. Developers are expensive.

Break the system into small parts, allocate one partition per CPU, and then use a small team to build each subsystem. Minimize interactions and communications between components and between the engineers.

Suppose the monolithic, single-CPU version of the product requires 100K lines of code. The COCOMO calculation gives a 1,403 man-month development schedule.

Segment the same project into four processors, assuming one has 50KLoC and the others 20KLoC each. Communication overhead requires a bit more code so we've added 10% to the 100-KLoC base figure. The schedule collapses to 1,030 man-months, or 73% of that required by the monolithic version.

Maybe the problem is quite orthogonal and divides neatly into many small chunks, none being particularly large. Five processors running 22 KLoC each will take 909 man-months, or 65% of the original, not-so-clever design.

Transistors are cheap—so why not get crazy and toss in lots of processors? One processor runs 20 KLoC and the other nine each run 10-KLoC programs. The resulting 724 man-month schedule is just half of the single-CPU approach. The product reaches consumers' hands twice as fast and development costs tumble. You're promoted and get one of those hot foreign company cars plus a slew of appreciating stock options. Being an engineer was never so good.
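For the skeptical, here is the same arithmetic for all four scenarios, again a sketch assuming MM = 2.8 × KSLoC^1.35 (from the COCOMO discussion below) with unity cost drivers:

    #include <math.h>
    #include <stdio.h>

    static double mm(double ksloc) { return 2.8 * pow(ksloc, 1.35); }

    int main(void)
    {
        printf("monolithic 100 KLoC:  %.0f\n", mm(100));             /* 1,403  */
        printf("4 CPUs (50+20+20+20): %.0f\n", mm(50) + 3 * mm(20)); /* ~1,030 */
        printf("5 CPUs (5 x 22):      %.0f\n", 5 * mm(22));          /* ~909   */
        printf("10 CPUs (20 + 9x10):  %.0f\n", mm(20) + 9 * mm(10)); /* ~724   */
        return 0;
    }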

The productivity collapse

(Most developers rebel at this point. "I can crank out a thousand lines of code over the weekend!" And no doubt that's true. However, these numbers reflect costs over the entire development cycle, from inception to shipping. Maybe you are a superprogrammer and consistently code much faster than, say, 200 lines of code per month. Even so, that only shifts the curve up or down on the graph. The shape of the curve, the exponential loss of productivity, is undeniable.)

Computer science professors show their classes graphs like the one in Figure 2 to terrify their students. The numbers are indeed scary. A million LoC project sentences us to the 32 LoC/month chain gang. We can whip out a small system over the weekend but big ones take years.

Or do they?

COCOMO

Barry Boehm, the father of software estimation, derived what he calls the Constructive Cost Model, or COCOMO, for determining the cost of software projects.5 Though far from perfect, COCOMO is predictive, quantitative, and probably the most well-established model extant.

Boehm defines three different development modes: organic, semidetached, and embedded, where "embedded" means a software project built under tight constraints in a complex of hardware, software, and sometimes regulations. Though he wasn't thinking of firmware as we know it, this is a pretty good description of a typical embedded system.

Under COCOMO the number of man-months (MM) required to deliver a system developed in the "embedded" mode is:

MM = 2.8 × (KSLoC)^1.20 × F1 × F2 × … × F15

where KSLoC is the number of lines of source code in thousands, and the Fi are 15 different cost drivers.

Cost drivers include factors such as required reliability, product complexity, real-time constraints, and more. Each cost driver is assigned a weight that varies from a little under 1.0 to a bit above. It's reasonable for a first approximation to assume these cost-driver figures all multiply out to about 1.0 for typical projects.

This equation's intriguing exponent dooms big projects. Schedules grow faster than code size does. Double the project's size and the schedule will grow by more than a factor of two—sometimes far more. (Even with Boehm's 1.20 exponent, doubling the code multiplies the schedule by 2^1.20, about 2.3.)

Despite his unfortunate use of the word to describe a development mode, Boehm never studied real-time embedded systems as we know them today, so there's some doubt about the validity of his exponent of 1.20. Boehm used American Airlines' Sabre reservation system as a prime example of a real-time application. Users wanted an answer "pretty fast" after hitting the enter key. In the real embedded world, where missing a deadline by even a microsecond means the 60 Minutes crew appears on the doorstep with multiple indictments, managing time constraints burns through the schedule at a prodigious rate.

Fred Brooks believes the exponent should be closer to 1.5 for real-time systems. Anecdotal evidence from some dysfunctional organizations gives a value closer to 2. The rule of thumb becomes: double the code and multiply man-months by four. Interestingly, that's close to the nearly n² number of communication channels between engineers noted before.
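That n² figure is just the complete-graph channel count: n engineers who must all coordinate with one another share n(n-1)/2 communication paths. A one-liner:

    /* Channels among n engineers when everyone talks to everyone:
       n(n-1)/2, which grows essentially as n^2. */
    unsigned channels(unsigned n) { return n * (n - 1) / 2; }
    /* channels(4) == 6, channels(10) == 45, channels(50) == 1225 */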

Let's pick an intermediate and conservative value, 1.35, which sits squarely between Boehm's and Brooks's estimates and is less than most anecdotal evidence suggests. Figure 2 shows how productivity collapses as the size of the program grows.

Correlating interruptions with productivity

In DeMarco and Lister's study, hundreds of developers competed in identical "coding war games"; the strongest predictor of performance turned out to be not experience or salary but a quiet, interruption-free workplace. The results? The top 25% were 260% more productive than the bottom quartile!

The lesson here is that interruptions kill software productivity, mirroring Joel Aron's results. Other work has shown it takes the typical developer 15 minutes to get into a state of "flow," where furiously typing fingers create a wide-bandwidth link between the programmer's brain and the computer. Disturb that concentration with an interruption and the link fails. It takes 15 minutes to rebuild that link but, on average, developers are interrupted every 11 minutes.4

Interruptions are the scourge of big projects.

A maxim of software engineering is that functions should be strongly cohesive but only weakly coupled. Development teams invariably act in the opposite manner. The large number of communication channels makes the entire group highly coupled. Project knowledge is distributed in many brains. Joe needs information about an API call. Mary is stuck finding a bug in the interface with Bob's driver. Each jumps up and disturbs a concentrating team member, destroying that person's productivity.

Subtract software costs by adding CPUs

In 1946 programmers created software for the ENIAC machine by rewiring plug-boards. Two years later the University of Manchester's Small-Scale Experimental Machine, nicknamed Baby, implemented von Neumann's stored program concept, for the first time supporting a machine language. Assembly language soon became available and flourished. But in 1957 Fortran, the first high-level language, debuted and forever changed the nature of programming.

In 1964, Dartmouth BASIC introduced millions of (comparatively) non-techies to the wonders of computing while forever poisoning their programming skills. Three years later, almost as a counterpoint, OOP (object-oriented programming) appeared in the guise of Simula 67. The C language, still the standard for embedded development, and C++ appeared in 1972 and 1985, respectively.

By the 1990s, a revolt against big, up-front design led to a flood of new "agile" programming methods including eXtreme Programming (XP), SCRUM, Test-Driven Development, Feature-Driven Development, the Rational Unified Process, and dozens more.

In the 50 years since programming first appeared, software engineering has morphed into something that would be utterly alien to the software developer of 1946. That half-century has taught us a few pivotal lessons about building programs. Pundits might argue that the most important is the elimination of the goto, the use of objects, or building from patterns.

They'd be wrong. The fundamental insight of software engineering is to keep things small. Break big problems into little ones.

For instance, we understand beyond a shadow of a doubt the need to minimize function sizes. No one is smart enough to understand, debug, and maintain a 1,000-line routine, at least not in an efficient manner. Consequently, we've learned to limit our functions to around 50 lines of code (LoC). Reams of data prove that restricting functions to a page of code or less reduces bug rates and increases productivity.

But why is partitioning so important?

A person's short-term memory is rather like a cache—a tiny cache, actually—that can hold only five to nine things before new data flushes the old. Big functions blow the programmer's mental cache. The programmer can no longer totally understand the code; errors proliferate.


Sunday, September 26, 2010

General Ledger & Cashbook

Enter Your Company Name

Go to the File menu and select Configure. Then click on the Global tab. Enter your company name in the Company Name field. (This is the name that will appear at the top of each printed report.) Then click on the OK button.


Post an Entry

Click on Entries in the dark grey navigation area on the left. This will take you to the Entries screen.

To begin a new entry click on the New button at the bottom. The screen will switch to New Entry mode. This is displayed in the bottom left corner of the screen.

Enter the date for the entry in the Date field at the top. Then select the first account from the drop-down list labelled Account.

You will notice that a new line appears at the top of the grid. The grid is used for viewing only. All data must be entered using the fields at the top of the screen above the grid.

Next enter the details in the Description field, the amount in the Amount field and indicate if it is a Debit or Credit.

Press the Enter key to go to the next line (or click on the blank line at the bottom of the grid) and repeat the process. When you are finished click on the Save button to save the entry to the database. You will not be able to save an entry unless the total debits equal the total credits.

To edit an entry click on the Edit button to switch the screen to Edit Mode. Then click on the line in the grid that you wish to alter. When you are finished click on the Save button to save the changes to the database. Click Cancel to discard any changes. Click Delete to delete the entire entry from the database.

View an Account

Click on Accounts in the dark grey navigation area on the left. This will take you to the Accounts screen.

Select the account you wish to view from the drop-down list. The entries for the selected account appear in the grid.

To view the entire entry double-click on it in the grid. This will take you directly to the Entries screen and display the entire entry. You can then double-click on a line in the entry grid to get back to the Accounts screen and view the account for that line.

Create a Cashbook

Go to the Maintain menu and select Cashbooks. To add a new cashbook use the Insert key on your keyboard.

In the Account field enter the abbreviation of the account you wish to associate with this cashbook or double-click to select it from a list.

In the Abbrev field enter the abbreviation you wish to use to identify this cashbook. Often this will be the same as the abbreviation of the account with which the cashbook is associated but it can be different.

Enter the name of the cashbook in the Name field. Again this will likely be the same as the name of the associated account but this is not necessarily so.

Optionally enter a description of the cashbook in the Description field. This is to remind you of the purpose for the cashbook.

To edit the details of a cashbook click on the field you wish to edit and then click it a second time (as distinct from a double-click).

To delete a cashbook hold down the Ctrl key and press the Delete key.

Make an Entry in a Cashbook

Click on Cashbooks in the dark grey navigation area on the left. This will take you to the Cashbooks screen.

Select the cashbook you wish to make an entry into from the drop-down list. The entries for the selected cashbook appear in the grid.

To begin a new entry click on the New button at the bottom. The screen will switch to New Entry mode.

Enter the date for the entry in the Date field at the bottom. Then enter the details in the Description field, the amount in the Amount field and indicate if it is a Debit or Credit. Select an account from the Other A/c drop-down list only if you wish to automatically create a corresponding entry in the ledger. Indicate if this entry has appeared on the bank statement by putting a tick in the check box next to On Statement. Finally save this entry to the database by clicking on the Save button or pressing the Enter key on your keyboard.

To edit an entry click on the Edit button to switch the screen to Edit Mode. When you are finished click on the Save button to save the changes to the database. Click Cancel to discard any changes. Click Delete to delete the entire entry from the database.

Reconcile a Cashbook with a Bank Statement

Click on Cashbooks in the dark grey navigation area on the left. This will take you to the Cashbooks screen.

Select the cashbook you wish to reconcile with from the drop-down list. The entries for the selected cashbook appear in the grid.

For those entries in the cashbook which have matching transactions on the bank statement put a tick in the check box next to On Statement. (You can toggle this on and off for an entry by double-clicking on it in the grid.)

For those transactions on the bank statement which do not have a matching entry in the cashbook create a new entry in the cashbook and put a tick in the check box next to On Statement.

The amount next to Stmt Balance at the top of the screen after the arrow (-->) is the closing statement balance. This should match the closing balance shown on the bank statement.

The amounts next to Balance are the opening and closing balances of the cashbook including unreconciled entries (entries which do not have a tick next to On Statement). The amounts next to A/c Balance are the opening and closing balances of the associated ledger account.

View or Print a Balance Sheet

Click on Balance in the dark grey navigation area on the left. This will take you to the Balance Sheet screen.

Enter the date of the balance sheet in the As At field at the bottom.

Click on the Print button to display a preview. To print click on the printer icon at the top.

Change the Layout of the Balance Sheet

Click on Balance in the dark grey navigation area on the left. This will take you to the Balance Sheet screen.

The balance sheet (A, L and P) accounts are shown in the tree view along with the other elements used on the balance sheet including headings, groups, subtotals, totals and a special Retained Earnings element. You can change the order of the elements by dragging and dropping them. To include an account or the Retained Earnings element in a group drag and drop it onto the group.

To create a new heading, group or subtotal click on the Heading, Group or Subtotal buttons to the right of the tree view. To rename an element click on the Edit button. To delete an element click on the Delete button.

View or Print an Income Statement

Click on Income in the dark grey navigation area on the left. This will take you to the Income Statement screen.

Enter the begin and end dates of the income statement in the Begin Period and End Period fields at the bottom. These can be turned on or off by putting or removing a tick in the check box above each field. If you do not specify a begin date then the income statement will include all entries in the ledger from the earliest entered date. If you do not specify an end date then all entries in the ledger up to the latest entered date will be included.

Click on the Print button to display a preview. To print it click on the printer icon at the top.

Change the Layout of the Income Statement

Click on Income in the dark grey navigation area on the left. This will take you to the Income Statement screen.

The income statement (I and E) accounts are shown in the tree view along with the other elements used on the income statement including headings, groups, subtotals, totals and a special Net Income element. You can change the order of the elements by dragging and dropping them. To include an account in a group drag and drop it onto the group. (The Net Income element cannot be moved from the last position. It must always remain at the bottom.)

To create a new heading, group or subtotal click on the Heading, Group or Subtotal buttons to the right of the tree view. To rename an element click on the Edit button. To delete an element click on the Delete button.

IMPORTANT NOTES

When you select an account from the Other A/c drop-down list a corresponding double-entry is automatically posted to the ledger and linked to this entry in the cashbook. If you make any changes to this entry in the cashbook the double-entry in the ledger is automatically updated to reflect the changes. If you delete the entry from the cashbook the ledger double-entry is also deleted automatically.

Note that the converse does not apply. In other words if you edit or delete the double-entry in the ledger no changes are automatically made to the cashbook entry. This is important because it allows for flexibility in editing the general ledger without affecting the cashbook and its reconciliation to a bank statement.

Note also that it is not necessary to specify an Other A/c when making a cashbook entry. In this case no ledger double-entry is posted. This is useful, for example, when setting up a new cashbook and you wish to enter an opening balance to match the balance shown on the bank statement.
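For readers curious how such a one-way link might be modeled, here is a minimal sketch in C; the types and function names are illustrative assumptions, not Ledger's actual implementation:

    /* Hypothetical model of a cashbook entry and its optional ledger link. */
    typedef struct LedgerEntry { double amount; int is_debit; } LedgerEntry;

    typedef struct CashbookEntry {
        double amount;
        int is_debit;
        LedgerEntry *ledger_link;  /* null when no Other A/c was selected */
    } CashbookEntry;

    /* Editing a cashbook entry propagates to its linked double-entry. */
    void update_cashbook(CashbookEntry *cb, double amount, int is_debit)
    {
        cb->amount = amount;
        cb->is_debit = is_debit;
        if (cb->ledger_link) {                    /* one-way propagation:   */
            cb->ledger_link->amount = amount;     /* ledger edits never     */
            cb->ledger_link->is_debit = is_debit; /* flow back to cashbook  */
        }
    }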

Accounting Software

General Ledger & Cashbook

Ledger is a FREE accounting system for any organization that needs a general ledger or cashbook. Because it is incredibly easy to install and use, it will also appeal to students of double-entry bookkeeping.

Ledger provides unparalleled flexibility. All account balances are calculated dynamically so that the standard accounting reports can be created for any arbitrary date or period without the need for a period close or roll-over.

Ledger has received more than 150,000 downloads since it was first made available in December 2004 without A SINGLE BUG REPORT! For more information please see our FAQ page or the discussions in the Responsive Software Google Group.

Wednesday, September 1, 2010

Personal Accounting

Mainly for home users: software that handles accounts-payable-type transactions, budget management, and simple account reconciliation at the inexpensive end of the market.

Low End

At the low end of the business markets, inexpensive applications software allows most general business accounting functions to be performed. Suppliers frequently serve a single national market, while larger suppliers offer separate solutions in each national market. Many of the low-end products are characterized by being "single-entry" products, as opposed to the double-entry systems seen in many businesses. Some products have considerable functionality but are not considered GAAP or IFRS/FASB compliant. Some low-end systems do not have adequate security or audit trails.

Mid Market

The mid-market covers a wide range of business software that may be capable of serving the needs of multiple national accountancy standards and allow accounting in multiple currencies. In addition to general accounting functions, the software may include integrated or add-on management information systems, and may be oriented towards one or more markets, for example with integrated or add-on project accounting modules.

Hybrid Solutions

As technology improves, software vendors have been able to offer increasingly advanced software at lower prices. This software is suitable for companies at multiple stages of growth. Many of the features of Mid Market and High End software (including advanced customization and extremely scalable databases) are required even by small businesses as they open multiple locations or grow in size. Additionally, with more and more companies expanding overseas or allowing workers to work from home, many smaller clients have a need to connect multiple locations. Their options are to employ software-as-a-service or another application that offers them similar accessibility from multiple locations over the internet.

Accounting software

Accounting software is application software that records and processes accounting transactions within functional modules such as accounts payable, accounts receivable, payroll, and trial balance. It functions as an accounting information system. It may be developed in-house by the company or organization using it, may be purchased from a third party, or may be a combination of a third-party application software package with local modifications. It varies greatly in its complexity and cost. The market has been undergoing considerable consolidation since the mid-1990s, with many suppliers ceasing to trade or being bought by larger groups.

Wednesday, February 10, 2010

Foreign exchange controls

Foreign exchange controls are various forms of controls imposed by a government on the purchase/sale of foreign currencies by residents or on the purchase/sale of local currency by nonresidents.
Common foreign exchange controls include:
Banning the use of foreign currency within the country
Banning locals from possessing foreign currency
Restricting currency exchange to government-approved exchangers
Fixed exchange rates
Restrictions on the amount of currency that may be imported or exported
Countries with foreign exchange controls are also known as "Article 14 countries," after the provision in the International Monetary Fund agreement allowing exchange controls for transitional economies. Such controls used to be common in most countries, particularly poorer ones, until the 1990s when free trade and globalization started a trend towards economic liberalization. Today, countries which still impose exchange controls are the exception rather than the rule.

Retail foreign exchange brokers

Retail traders (individuals) constitute a growing segment of this market, both in size and importance. Currently, they participate indirectly through brokers or banks. Retail brokers, while largely controlled and regulated in the USA by the CFTC and NFA, have in the past been subjected to periodic foreign exchange scams.[8][9] To deal with the issue, the NFA and CFTC began (as of 2009) imposing stricter requirements, particularly in relation to the amount of Net Capitalization required of its members. As a result, many of the smaller and perhaps questionable brokers are now gone.
There are two main types of retail FX brokers offering the opportunity for speculative currency trading: brokers and dealers or market makers. Brokers serve as an agent of the customer in the broader FX market, by seeking the best price in the market for a retail order and dealing on behalf of the retail customer. They charge a commission or mark-up in addition to the price obtained in the market. Dealers or market makers, by contrast, typically act as principal in the transaction versus the retail customer, and quote a price they are willing to deal at—the customer has the choice whether or not to trade at that price.
In assessing the suitability of an FX trading service, the customer should consider the ramifications of whether the service provider is acting as principal or agent. When the service provider acts as agent, the customer is generally assured of a known cost above the best inter-dealer FX rate. When the service provider acts as principal, no commission is paid, but the price offered may not be the best available in the market—since the service provider is taking the other side of the transaction, a conflict of interest may occur.

Market participants

Unlike a stock market, the foreign exchange market is divided into levels of access. At the top is the inter-bank market, which is made up of the largest commercial banks and securities dealers. Within the inter-bank market, spreads, which are the difference between the bid and ask prices, are razor sharp and not known to players outside the inner circle. The difference between the bid and ask prices widens as you go down the levels of access (from 0-1 pip to 1-2 pips for currencies such as the EUR). This is due to volume. If a trader can guarantee large numbers of transactions for large amounts, they can demand a smaller difference between the bid and ask price, which is referred to as a better spread. The levels of access that make up the foreign exchange market are determined by the size of the "line" (the amount of money with which they are trading). The top-tier inter-bank market accounts for 53% of all transactions. After that there are usually smaller banks, followed by large multi-national corporations (which need to hedge risk and pay employees in different countries), large hedge funds, and even some of the retail FX market makers. According to Galati and Melvin (2004), "Pension funds, insurance companies, mutual funds, and other institutional investors have played an increasingly important role in financial markets in general, and in FX markets in particular, since the early 2000s." In addition, they note, "Hedge funds have grown markedly over the 2001–2004 period in terms of both number and overall size." Central banks also participate in the foreign exchange market to align currencies to their economic needs.

Market size and liquidity

The foreign exchange market is the largest and most liquid financial market in the world. Traders include large banks, central banks, currency speculators, corporations, governments, and other financial institutions. The average daily volume in the global foreign exchange and related markets is continuously growing. Daily turnover was reported to be over US$3.2 trillion in April 2007 by the Bank for International Settlements.[2] Since then, the market has continued to grow. According to Euromoney's annual FX Poll, volumes grew a further 41% between 2007 and 2008.[3]
Of the $3.98 trillion daily global turnover, trading in London accounted for around $1.36 trillion, or 34.1% of the total, making London by far the global center for foreign exchange. In second and third places respectively, trading in New York accounted for 16.6%, and Tokyo accounted for 6.0%.[4] In addition to "traditional" turnover, $2.1 trillion was traded in derivatives.
Exchange-traded FX futures contracts were introduced in 1972 at the Chicago Mercantile Exchange and are actively traded relative to most other futures contracts.

Foreign exchange market

The foreign exchange market (forex, FX, or currency market) is a worldwide decentralized over-the-counter financial market for the trading of currencies. Financial centers around the world function as anchors of trading between a wide range of different types of buyers and sellers around the clock, with the exception of weekends.
The purpose of the foreign exchange market is to assist international trade and investment. The foreign exchange market allows businesses to convert one currency to another. For example, it permits a U.S. business to import European goods and pay in Euros, even though the business's income is in U.S. dollars. Some experts, however, believe that the unchecked speculative movement of currencies by large financial institutions such as hedge funds impedes the markets from correcting global current account imbalances. This carry trade may also lead to loss of competitiveness in some countries.[1]
In a typical foreign exchange transaction a party purchases a quantity of one currency by paying a quantity of another currency. The modern foreign exchange market started forming during the 1970s when countries gradually switched to floating exchange rates from the previous exchange rate regime, which remained fixed as per the Bretton Woods system.
The foreign exchange market is unique because of:
trading volume resulting in market liquidity
geographical dispersion
continuous operation: 24 hours a day except weekends, i.e. trading from 20:15 UTC on Sunday until 22:00 UTC Friday
the variety of factors that affect exchange rates
the low margins of relative profit compared with other markets of fixed income
the use of leverage to enhance profit margins with respect to account size

Use of High Leverage

By offering high leverage, the market maker encourages traders to trade extremely large positions. This increases the trading volume cleared by the market maker and increases his profits, but increases the risk that the trader will receive a margin call. While professional currency dealers (banks, hedge funds) seldom use more than 10:1 leverage, retail clients are generally offered leverage between 50:1 and 200:1.[2]
A self-regulating body for the foreign exchange market, the National Futures Association, warns traders in a forex training presentation of the risk in trading currency. “As stated at the beginning of this program, off-exchange foreign currency trading carries a high level of risk and may not be suitable for all customers. The only funds that should ever be used to speculate in foreign currency trading, or any type of highly speculative investment, are funds that represent risk capital; in other words, funds you can afford to lose without affecting your financial situation.”
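To see why those leverage figures matter, here is a toy calculation; the deposit size and the 1% move are illustrative assumptions, not broker data:

    #include <stdio.h>

    int main(void)
    {
        double margin   = 1000.0;   /* trader's deposit, USD (assumed)    */
        double leverage = 100.0;    /* 100:1, within the typical range    */
        double position = margin * leverage;

        /* At 100:1, a mere 1% adverse move in the exchange rate equals
           the entire deposit; a margin call comes well before that. */
        double adverse_move = 0.01;
        printf("position: $%.0f, loss on a 1%% move: $%.0f (100%% of margin)\n",
               position, position * adverse_move);
        return 0;
    }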

Forex

A forex (or foreign exchange) scam is any trading scheme used to defraud traders by convincing them that they can expect to gain a high profit by trading in the foreign exchange market. Currency trading "has become the fraud du jour" as of early 2008, according to Michael Dunn of the U.S. Commodity Futures Trading Commission.[1] But "the market has long been plagued by swindlers preying on the gullible," according to the New York Times.[2] "The average individual foreign-exchange-trading victim loses about $15,000, according to CFTC records" according to The Wall Street Journal.[3] The North American Securities Administrators Association says that "off-exchange forex trading by retail investors is at best extremely risky, and at worst, outright fraud."[4]
"In a typical case, investors may be promised tens of thousands of dollars in profits in just a few weeks or months, with an initial investment of only $5,000. Often, the investor’s money is never actually placed in the market through a legitimate dealer, but simply diverted – stolen – for the personal benefit of the con artists."[5]
In August 2008, the CFTC set up a special task force to deal with growing foreign exchange fraud.[6]
The forex market is a zero-sum game,[7] meaning that whatever one trader gains, another loses, except that brokerage commissions and other transaction costs are subtracted from the results of all traders, technically making forex a "negative-sum" game.
These scams might include churning of customer accounts for the purpose of generating commissions, selling software that is supposed to guide the customer to large profits,[8] improperly managed "managed accounts",[9] false advertising,[10] Ponzi schemes and outright fraud.[4][11] It also refers to any retail forex broker who indicates that trading foreign exchange is a low risk, high profit investment.[12]
The U.S. Commodity Futures Trading Commission (CFTC), which loosely regulates the foreign exchange market in the United States, has noted an increase in the amount of unscrupulous activity in the non-bank foreign exchange industry.[13]
An official of the National Futures Association was quoted as saying, "Retail forex trading has increased dramatically over the past few years. Unfortunately, the amount of forex fraud has also increased dramatically."[14] Between 2001 and 2006 the U.S. Commodity Futures Trading Commission prosecuted more than 80 cases involving the defrauding of more than 23,000 customers who lost $350 million. From 2001 to 2007, about 26,000 people lost $460 million in forex frauds.[1] CNN quoted Godfried De Vidts, President of the Financial Markets Association, a European body, as saying, "Banks have a duty to protect their customers and they should make sure customers understand what they are doing."
