Thursday, October 31, 2013

US Treasury Oct 2013 Report on International Economic and Exchange Rate Policies Summary

This summary covers only the Key Findings section of the US Treasury's "Report to Congress on International Economic and Exchange Rate Policies," Oct 31, 2013. You can find the PDF file of the report here.

Key Findings

US Economy

 - US real GDP grew 1.8 percent (annual rate) during the first half of 2013
 - Consensus of private forecasters: 2.8 percent growth in 2014 expected
 - Growth supported by consumer spending and residential investment
     (due to improved household balance sheets, accommodative credit conditions, and faster job creation)
 - Unemployment: dropped to 7.2% as of Sep 2013, the lowest level since Nov 2008
 - Federal deficit: fell from 7% to 4% of GDP, the lowest level since FY 2008; the rapid pace of deficit reduction has acted as a headwind to growth
 - The Administration remains committed to further reducing the deficit consistent with economic conditions

Global Economy

 - IMF projects: Growth of 2.9% in 2013 (down from 3.2% in 2012), 3.6% in 2014
 - Europe: Signs that the lengthy recession in the euro area has come to an end, although unemployment rates remain high
 - Emerging Market Growth: Slowed due to
    1) waning of post-crisis stimulus, 2) slowing global export demand, and 3) tighter credit conditions
 - Emerging Market Currencies: Many depreciated in 2013. After an initial market sell-off in May, investors began to differentiate among emerging market countries according to
    1) the strength of their policy frameworks, and 2) their current account positions
 - Many emerging markets have strengthened their economic institutions and policy frameworks over the past decade and thus increased their resilience to shifts in capital flows.

Euro area

 - Countries with large and persistent surpluses need to take action to
    1) boost domestic demand growth, and 2) shrink their surpluses
 - Germany maintained a large current account surplus throughout the euro area financial crisis
 - In 2012, Germany's nominal current account surplus was larger than that of China
 - Germany's slow growth of domestic demand and dependence on exports have hampered global rebalancing at a time when many other euro-area countries have been under severe pressure to curb demand and compress imports in order to promote adjustment.
 - The net result: deflationary bias for the euro area.

Asia

 - Many Asian economies tightly manage their exchange rates, with varying degrees of active management
 - Need for greater exchange rate flexibility, most notably in China

China

 - China RMB: appreciated a modest 2.2% in nominal terms year-to-date as of Oct 18, 2013
 - In the most recent IMF Article IV consultation, the IMF concluded that the RMB was moderately undervalued against a broad basket of currencies, and the IMF's Pilot External Sector Report estimated that the RMB was undervalued by about 5 to 10 percent on a real effective basis, as of July 2013.
 - Chinese authorities joined other G-20 members in pledging "not to target exchange rates for competitive purposes," at the Feb 2013 G-20 Finance Ministers and Central Bank Governors Meeting in Moscow.
 - China also committed to reduce the pace of reserve accumulation and increase the transparency of its exchange rate policy, at the 2012 G-20 Leaders Summit in Los Cabos, Mexico.
 - Official intervention fell from late 2011 to early 2012 as capital inflows into China slowed, but it resumed once those pressures receded; the PBOC and Chinese financial institutions collectively purchased a record $110 billion in foreign exchange in both Jan and Sep 2013.
 - Disclosure: China does not disclose its intervention in FX markets; it should do so to increase the credibility of its monetary policy framework and to promote exchange rate and financial market transparency.

Japan: No intervention

 - Has not intervened in the FX markets in almost two years
 - Ruled out purchases of foreign assets as a monetary policy tool
 - Has refrained from public comment on the level of the exchange rate
 - Joined the G-20 statement in Feb 2013 (not targeting exchange rates for competitive purposes)
 - To support a stronger economic recovery and increase potential growth, Japan should calibrate the pace of fiscal consolidation to the recovery in domestic demand and take steps to boost domestic demand

Korea

 - The Korean won depreciated moderately against the US dollar in the first half of 2013
 - The IMF's July 2013 Pilot External Sector Report finds that Korea's real effective exchange rate remains undervalued in a range from 2 to 8 percent
 - The Korean government does not publish intervention data, but it should disclose such data shortly after intervening
 - Korea is estimated to have intervened to limit the pace of both won appreciation and depreciation during 2013
 - Foreign exchange reserves stood at $326 billion as of end-Sep 2013

Conclusion

 - No major trading partner of the US is manipulating the rate of exchange between its currency and the US dollar for purposes of preventing effective balance of payments adjustments or gaining unfair competitive advantage in international trade, as identified in Section 3004 of the Act.
 - Treasury is pushing for comprehensive adherence to recent G-7 and G-20 commitments.

Sunday, September 22, 2013

A nice way to manage risk is to put a hedge on it, but a better way is to understand it better.

Saturday, July 20, 2013

Using Java classes in Matlab

Feeling restricted even after having access to all the toolboxes and third-party .m code for Matlab? You can still broaden your Matlab horizons by using Java classes in it. For example, let's take a look at how we can use Java's LinkedList in Matlab.

>> import java.util.LinkedList;
>> list = LinkedList();
>> list.add(1);
>> list.add(2);
>> list.add(3);
>> item = list.remove();
>> disp(item)
1
>> item = list.remove();
>> disp(item)
2
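
The same pattern works for other standard Java classes. As a further illustration, here is a minimal sketch using java.util.HashMap; the keys and values are made up for the example.

>> import java.util.HashMap;
>> map = HashMap();
>> map.put('apple', 0.95);    % keys and values auto-convert to Java types
>> map.put('grapes', 3.20);
>> price = map.get('apple');  % returns the stored value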

Thursday, July 12, 2012

Sustainable programming habits - Abstraction

Programming, or more specifically coding, is the process of stating what you want in a precise language that a computer can understand. Compared to natural human languages, programming languages have structures that feel inconvenient because of the clarity they demand: you have to tell your computer details that you would never have to spell out for a human being.


Since you have to state every tiny detail, the coding process easily becomes a series of tedious and monotonous tasks. Therefore, keeping the code you have already written in a reusable form becomes very important in order to avoid such monotonous tasks as much as possible.


Then how should we write reusable code? You can achieve it by following one big rule: each logic (logical flow) should be stated only once in the code. For instance, suppose there is code that buys an apple and code that buys a bunch of grapes. The series of actions required to 'purchase a product' - comparing prices in the market, selecting one, paying, and receiving the product - is the same regardless of whether the product is an apple or a bunch of grapes. Since the two processes share the same logic, they can be implemented by the same code once an apple and a bunch of grapes are abstracted into a 'product.' If you instead write the purchasing logic separately for the two products, the task is boring and unnecessarily time-consuming. Moreover, if you find something wrong with your purchasing logic, you have to modify two (or possibly many) different parts of the code for one change to the logic, which severely increases your chance of making mistakes. To write code that follows our big rule, we can use a Top-Down or a Bottom-Up approach.
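
Before comparing the two approaches, here is a minimal Matlab sketch of the 'product' abstraction just described; the struct fields name and offers are hypothetical stand-ins for a real product representation. Save it as purchaseProduct.m:

function receipt = purchaseProduct(product)
% Shared 'purchase a product' logic, stated only once.
% 'product' is a struct with hypothetical fields:
%   name   - a label for the product
%   offers - the prices observed in the market
[price, idx] = min(product.offers);     % compare prices and select the cheapest
% (paying and receiving the product would go here)
receipt = sprintf('Bought %s for %.2f (offer #%d)', product.name, price, idx);
end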

The Top-Down approach is to group and separate the parts of the logic that are likely to be reused before starting to write code. Although it has the advantage that you can establish a well-defined code structure up front, it can also lead to unnecessarily complex structure, because in principle even the tiniest part of a logic can be grouped and reused. That is why I do not recommend this approach.

The Bottom-Up approach is first to write code that performs the desired task without extracting any sub-logic, and then to adjust the code when needed. Specifically, when you later have to write similar code that shares part of a previously used logic, you separate that shared sub-logic into a function, a class, or some other reusable form. Suppose you have already written the code for purchasing a bunch of grapes. When you later need code for buying an apple, you extract the 'purchase a product' logic from the grape-buying code. The grape-buying code then shrinks to setting a bunch of grapes as the 'product' and calling the purchasing function, and the apple-buying code does the same with an apple as the 'product.'
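
Continuing the hypothetical sketch above, after this refactoring each product-specific routine shrinks to constructing its own 'product' and calling the shared function:

>> grapes = struct('name', 'a bunch of grapes', 'offers', [3.50 3.20 3.80]);
>> disp(purchaseProduct(grapes))
Bought a bunch of grapes for 3.20 (offer #2)
>> apple = struct('name', 'an apple', 'offers', [1.20 0.95 1.10]);
>> disp(purchaseProduct(apple))
Bought an apple for 0.95 (offer #2)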

I prefer the Bottom-Up approach to the Top-Down one. The Top-Down approach contributes to unnecessarily complex code, and thus to code that is hard to understand (and hard to debug). The Bottom-Up approach, on the other hand, lets you write the code you immediately need without hesitation, and conduct abstraction only when it is actually needed. The process of extracting the elements needed for abstraction from existing code is called "refactoring." There are several refactoring tools for the Visual Studio development environment, which is widely used for C++ and C# based development. Unfortunately, I could not find a refactoring tool for VBA, which I currently use heavily; however, I think a refactoring tool for VB.Net could serve as a substitute.


In short, although coding can be a tedious and monotonous job, you can minimize such monotonous tasks by using abstraction properly. To achieve this, I recommend a Bottom-Up coding style, and I suggest you use the refactoring tools that are available for several well-known programming environments.

Friday, July 6, 2012

Time Series Divider


(download link)

This Excel file divides a time series whenever the linear trend of the series changes. To detect the change points, I used my own modified version of the Quandt (1960) statistic. Since Quandt-type dividing requires a regression model, I used a simple linear trend regression as the underlying model.
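
For readers curious about the mechanics, here is a minimal Matlab sketch of plain Quandt-type break detection with a linear trend model. It is not the modified statistic used in the file, and for simplicity it assumes a single break and a common error variance across segments.

function [kBest, lr] = quandtBreak(y)
% y: column vector time series. Returns the most likely single break point
% kBest and a likelihood-ratio-style statistic lr(k) for each candidate k.
n = length(y);  t = (1:n)';
ssrFull = trendSSR(t, y);                 % one linear trend over the full sample
minSeg = 5;                               % minimum segment length (arbitrary choice)
lr = -inf(n, 1);
for k = minSeg:(n - minSeg)
    ssrSplit = trendSSR(t(1:k), y(1:k)) + trendSSR(t(k+1:n), y(k+1:n));
    lr(k) = (n/2) * (log(ssrFull/n) - log(ssrSplit/n));   % LR under normal errors
end
[~, kBest] = max(lr);                     % break point with the strongest evidence
end

function ssr = trendSSR(t, y)
X = [ones(size(t)) t];                    % intercept + linear trend
b = X \ y;                                % OLS fit
ssr = sum((y - X*b).^2);                  % sum of squared residuals
end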

Tuesday, April 10, 2012

Thin Plate Spline


The results of Thin Plate Spline Smoothing with 30 sample points

I have implemented Thin Plate Spline (TPS) smoothing to construct an initial Implied Volatility Surface estimate from limited data points that have different forward-moneyness grid points for each maturity. To gauge this feature, I have tested the performance of TPS with a sample function.

Original function: sqrt( (x-0.5)^2 + (y-0.5)^2 )
Domain: [0,1]x[0,1]
Construction of the input data: I generated 30 random points in the domain and calculated the theoretical function values at them.

After applying the thin plate spline to the 30 sample input points, we obtain the approximated function values shown above for each point of a grid on the domain [0,1]x[0,1].
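
For reference, here is a minimal Matlab sketch of the standard TPS construction in the interpolating case; a smoothing version would add a regularization term (lambda times the identity) to the kernel block. It reproduces the experiment above, but it is not necessarily the exact implementation I used.

% Thin plate spline: f(p) = sum_i w_i*U(|p - P_i|) + a0 + a1*x + a2*y,
% with kernel U(r) = r^2 * log(r) and U(0) defined as 0.
rng(0);
n = 30;
P = rand(n, 2);                                   % 30 random sample points
f = @(x, y) sqrt((x - 0.5).^2 + (y - 0.5).^2);    % test function
z = f(P(:,1), P(:,2));

% Pairwise distances and kernel matrix
D = sqrt(bsxfun(@minus, P(:,1), P(:,1)').^2 + bsxfun(@minus, P(:,2), P(:,2)').^2);
K = D.^2 .* log(D + (D == 0));                    % the (D==0) term avoids log(0)
Q = [ones(n,1) P];                                % affine part

% Solve [K Q; Q' 0][w; a] = [z; 0] for kernel weights w and affine coefficients a
A  = [K Q; Q' zeros(3)];
wa = A \ [z; zeros(3,1)];
w  = wa(1:n);  a = wa(n+1:end);

% Evaluate the spline on a regular grid over [0,1]x[0,1] and plot the surface
[gx, gy] = meshgrid(linspace(0, 1, 50));
G  = [gx(:) gy(:)];
Dg = sqrt(bsxfun(@minus, G(:,1), P(:,1)').^2 + bsxfun(@minus, G(:,2), P(:,2)').^2);
Kg = Dg.^2 .* log(Dg + (Dg == 0));
zg = Kg*w + [ones(size(G,1),1) G]*a;
surf(gx, gy, reshape(zg, size(gx)))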