Thursday, April 25, 2013

Debugging Remote Applications With ECLIPSE

1.  You must have the Android SDK plugin for Eclipse installed. Instructions to install it are here:

2.  From the DDMS perspective or an adb shell, find the debug port of the process that you want to debug.
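If you prefer the command line to DDMS, adb can list the debuggable processes and forward a local port to one of them. A quick sketch (the PID 1234 is a placeholder; use a PID from your own adb jdwp output):

```shell
# List the PIDs of processes on the device that accept a JDWP debugger
adb jdwp

# Forward a local TCP port to the JDWP transport of the chosen process
# (1234 is a placeholder PID taken from the adb jdwp output above)
adb forward tcp:8700 jdwp:1234
```

Eclipse can then attach to localhost on the forwarded port. For reference, DDMS itself assigns each debuggable process a port starting at 8600 and mirrors the currently selected process on port 8700.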

3.  From the Debug perspective, go to Run, then Debug Configurations… Select Remote Java Application and click the New Launch Configuration button in the top left corner of the dialog.


4.  Create a new configuration. Enter a name for it and change the Port field to the port of the process that will be debugged.


5.  Click on the Source tab, then click Add…

Select File System Directory and click OK.

Browse to the location of the Android framework sources and make sure that the Search subfolders box is checked. Then click OK. Repeat the process for any other paths that contain source files required for your application.

6.  Click Apply and then click Debug to start debugging the previously selected process.

7.  When the debugging starts, the Debug perspective should look similar to this:


After hitting a breakpoint, the debugger shows the file and line where execution stopped, along with the stack trace:


For subsequent debugging sessions of other processes, just go back to step 3, change the port number to that of the new process you want to debug, and click Apply and then Debug; nothing else has to be reconfigured.

Wednesday, April 24, 2013

Embedded panels and their impact on application frame rate and UI fluidity


DISPLAY BASICS [1]
In general, the display subsystem of an embedded system is designed to transfer data from a block of internal processor memory called a Frame Buffer (FB), which software running on the main processor updates to change the image being displayed. The display data consists of some number of bits for each pixel of the displayed area. In most cases there are specific bits describing the intensity of the red, green and blue (RGB) components of each pixel, although other formats such as YUV/YCrCb (luminance, red chrominance, blue chrominance) are occasionally used.
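As a rough sense of scale, the frame buffer size follows directly from the resolution and the bits per pixel. A small sketch (the 1920x1080 resolution and the two formats are just illustrative choices):

```python
# One frame buffer holds bits_per_pixel bits for every pixel on the panel.
def fb_bytes(width, height, bits_per_pixel):
    """Size in bytes of a single frame buffer."""
    return width * height * bits_per_pixel // 8

# A 1920x1080 panel as an example:
print(fb_bytes(1920, 1080, 32))  # 32-bit RGB (e.g. RGBX8888): 8294400 bytes
print(fb_bytes(1920, 1080, 16))  # 16-bit RGB (e.g. RGB565):   4147200 bytes
```

This is one reason panel-side frame buffers (see the command mode discussion below) add cost: several megabytes of dedicated memory per buffer.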
A logic block generally referred to as a Display Controller fetches data from the Frame Buffer, formats it according to the desired display interface, and transmits it to the Display. Figure 1 shows a basic system, where the Display Controller is internal to the Embedded Processor and communicates directly with the Display.
 
In many systems the Embedded Processor includes a Display Interface Controller, which creates an intermediate communication structure that is separate from the Display Interface. In this case the Display Controller is external to the Embedded Processor, and is often included within the Display itself. This external Display Controller often includes its own Frame Buffer. Systems of this type are shown in Figure 2. Note that it is also possible to connect to an external Display Controller via a standard bus such as PCI.
 

The intensity of each pixel on a display, whether it is a CRT, LCD or other type, generally must be periodically refreshed. The architecture of this function was developed in the days of CRTs, but has remained quite consistent as shown in Figure 3. The refresh is usually done by "raster scanning", which starts at the first pixel (typically the upper left hand corner), generates an intensity for that pixel, and then moves horizontally through all pixels of the first scan line. At that point a "horizontal synchronization" or HSYNC signal occurs, which causes the refresh to move to the beginning of the second line, and so on. This process continues until all lines have been refreshed, at which point a "vertical synchronization" or VSYNC signal occurs. This causes the refresh to return to the first pixel, and the process repeats.

The physical nature of the display generally dictates that the response to HSYNC or VSYNC is not instantaneous, and thus nothing is refreshed for a time after each signal is received. These times are referred to as "blanking periods". They are shown as the dashed lines in Figure 3.
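The scan timings above translate directly into a pixel clock: every pixel of every line, including the blanked ones, is clocked out once per refresh. A sketch using the standard 1080p60 timing (1920 active + 280 blanked pixels per line, 1080 active + 45 blanked lines per frame):

```python
# Pixel clock = total pixels per line * total lines per frame * refresh rate,
# where "total" includes the horizontal and vertical blanking periods.
def pixel_clock_hz(h_active, h_blank, v_active, v_blank, refresh_hz):
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz

print(pixel_clock_hz(1920, 280, 1080, 45, 60))  # 148500000, i.e. 148.5 MHz
```

This 148.5 MHz figure is the well-known pixel clock of the standard 1080p60 video timing; the blanking periods alone account for about 19% of it.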
 


Command Mode Panels (Generally used in high-end smartphones)
        DSI (Display Serial Interface) in command mode expects the panel to have an internal frame buffer. The job of the DSS (Display Sub System) is just to push out a frame to the panel. The panel has a controller inside it which refreshes the screen from the contents of the internal buffer, using its own timings.
        In DSI command mode, you control the rate at which data is pushed out to the panel. So if the application generates data at 50 fps, the DSS pushes it out at that rate. However, since the panel's controller is refreshing at 60 Hz on its own, you will see tearing, because the internal buffer is being filled more slowly than it is being scanned out.
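A toy model of this situation: the host fills the panel's internal buffer at 50 fps while the panel self-refreshes at 60 Hz from that same buffer. The rates and the line count are illustrative assumptions, not measurements of any particular panel:

```python
PANEL_HZ = 60.0   # panel controller's own refresh rate
HOST_FPS = 50.0   # rate at which the host rewrites the internal buffer
LINES = 100       # lines per frame (coarse model)

def line_write_time(frame, line):
    """Time at which the host finishes writing `line` of `frame`."""
    return frame / HOST_FPS + (line + 1) / (HOST_FPS * LINES)

def source_frame(t, line):
    """Newest host frame whose copy of `line` is complete at time t."""
    frame = int(t * HOST_FPS)
    while line_write_time(frame, line) > t:
        frame -= 1
    return frame

torn = 0
for refresh in range(int(PANEL_HZ)):      # one second of panel refreshes
    start = refresh / PANEL_HZ
    sources = {source_frame(start + line / (PANEL_HZ * LINES), line)
               for line in range(LINES)}
    if len(sources) > 1:                  # this refresh mixed two host frames
        torn += 1
print("torn refreshes per second:", torn)
```

Because the 50 fps writes and the 60 Hz refreshes are not locked to each other, some refreshes scan out lines belonging to two different host frames, which is exactly the tearing described above.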




Video Mode Panels (Generally used in tablets)
        DSI in video mode requires the host to push out data continuously to the panel. The DSS completely controls the panel's timings and hence its refresh rate. This is needed because the panel has no internal buffer and needs the host to pump data to it continuously.
        In video mode, the DSS driver provides an API that tells an upper layer the right time to present a new frame to the DSS, namely when a vsync occurs. If the application uses this API to wait for a vsync but cannot produce a frame within one vsync period, it is ready only for every other vsync from the panel, halving the frame rate.
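The halving effect can be seen in a toy model of this wait-for-vsync loop, with render times expressed in vsync periods (the 60 Hz rate and the render times are illustrative assumptions):

```python
import math

def effective_fps(render_vsyncs, vsync_hz=60):
    """Frames presented per second when each frame takes `render_vsyncs`
    vsync periods to render and presentation always waits for the next
    vsync boundary."""
    n, presented = 0, 0                   # n = vsync index, in vsync units
    while n < vsync_hz:                   # simulate one second
        n = math.ceil(n + render_vsyncs)  # next vsync after rendering ends
        presented += 1
    return presented

print(effective_fps(0.6))  # renders within one vsync period: full 60 fps
print(effective_fps(1.2))  # just misses each vsync: only 30 fps
```

Note the cliff: a frame that takes even slightly longer than one vsync period does not degrade gracefully to 50-something fps; it drops straight to half the panel rate.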



Conclusion
In general, command mode panels achieve better frame rates than video mode panels for browser use cases because of the panel's additional internal frame buffer.

References
[1] http://www.kozio.com/view_files/Embedded_Display_Interfaces_WP_final.pdf