
1.1 Software Design

The primary on-board data handling task for each module of StrathSat-R is to store several camera streams. Two camera streams on SHIRE track the initial ejection of SAM and FRODO; a further two on FRODO and three on SAM monitor the respective deployable structures after ejection from SHIRE. Besides this main task, the controllers must also sample the additional sensors particular to each ejectable module and its payload. A further task for SAM is to drive its actuators through the required duty cycles.

1.1.1 Development Platform

The software will be developed on the mbed NXP LPC1768 MCU. Built around an ARM Cortex-M3 core, it should provide a robust and flexible development unit for use in each module. The mbed can be programmed through its own high-level API, with custom functions written in C++; assembly language can also be used for lower-level functions. The mbed is not powerful enough to process the video feeds required in this project, so it is being paired with a Xilinx Spartan-3E FPGA. The FPGA design will be written in VHDL and will handle the parallel computing tasks.
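As a flavour of the mbed development style, a minimal hedged sketch using the classic mbed C++ API; the LED and serial names are the platform's own, everything else is placeholder:

    #include "mbed.h"

    // Minimal mbed sketch: blink a status LED and report over the USB serial link.
    DigitalOut statusLed(LED1);      // one of the four on-board LEDs
    Serial pc(USBTX, USBRX);         // virtual serial port over USB

    int main() {
        while (true) {
            statusLed = !statusLed;  // toggle the LED
            pc.printf("heartbeat\r\n");
            wait(0.5f);              // delay in seconds
        }
    }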

1.1.2 SHIRE

As the housing unit for the ejectable modules prior to ejection, SHIRE must accept the signals from REXUS and then perform the actions that affect the ejectable units. This is represented in the SHIRE control algorithm shown below. SHIRE will monitor the health and status of SAM and FRODO during the ascent, before ejection. Ten seconds before the ejection of SAM and FRODO, SHIRE will begin storing the data feeds from both of its cameras; this timing overlap ensures that SHIRE records the ejection process. SHIRE will then activate the pyrocutters to eject SAM and FRODO, and will continue to store images from the dual camera feed for the duration of the experiment. A diagram of the SHIRE system architecture is shown below:

SHIRE monitors its own health signals (battery level and CPU temperature) as well as recording the health signals of FRODO and SAM. SHIRE responds to the signals from the RXSM in order to correctly schedule the data storage and ejection functions of the experiment; a sketch of this sequencing logic follows.
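A minimal sketch of the sequencing written against the mbed C++ API; the pin assignments, helper functions and ejection time are hypothetical placeholders, and only the 10 s camera lead-in comes from the text above:

    #include "mbed.h"

    // Hypothetical pin assignments for the RXSM command line and the pyrocutters.
    DigitalIn  rxsmSignal(p5);       // timeline/ejection signalling from the RXSM
    DigitalOut pyrocutters(p6);      // fires the cutters that release SAM and FRODO
    Timer      missionClock;

    void storeCameraFrames() { /* hypothetical: push both camera feeds to storage */ }
    void monitorHealth()     { /* hypothetical: poll SHIRE, SAM and FRODO health */ }

    int main() {
        missionClock.start();
        bool recording = false;
        bool ejected   = false;
        const int T_EJECT_MS = 60000;    // placeholder; set by the RXSM timeline

        while (true) {
            monitorHealth();                              // throughout the flight
            int t = missionClock.read_ms();
            if (!recording && t >= T_EJECT_MS - 10000) {  // start cameras 10 s early
                recording = true;
            }
            if (recording) storeCameraFrames();           // runs to end of experiment
            if (!ejected && t >= T_EJECT_MS) {
                pyrocutters = 1;                          // eject SAM and FRODO
                ejected = true;
            }
        }
    }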

The method of saving the camera data streams, and other data, to the SD card is common to SHIRE, SAM and FRODO, and is detailed in [X].

1.1.3 SAM

The system architecture of SAM is shown in the figure below. The visual data storage operation is as described previously: the MCU provides the control and scheduling signals for each camera, while the FPGA handles the camera data output lines. [Note: figure 4 still needs updating to include some of Thomas' work, but it will roughly be like this; the list of 'Still need to know!' will be removed.] Analogue signals such as temperature and battery level will use the on-board ADC of the MCU. The GPS receiver and transmitter unit uses its own internal controller, interfacing minimally with the MCU [or not at all?].

Figure - System Architecture of SAM

The microcontroller on board SAM must control the actuation phases as well as the cameras, as shown in the figure below. The timings for the cycles are shown in that figure and repeated in the command timeline table; a sketch of driving one actuator through a duty cycle follows.
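A hedged sketch of one actuation duty cycle using the mbed PwmOut API; the pin, PWM period, duty ratio and phase lengths are placeholders standing in for the real timings:

    #include "mbed.h"

    PwmOut actuator(p21);            // placeholder actuator drive pin

    int main() {
        actuator.period_ms(20);      // placeholder PWM period
        while (true) {
            actuator.write(0.75f);   // actuation phase at 75% duty (placeholder)
            wait(5.0f);
            actuator.write(0.0f);    // rest phase
            wait(5.0f);
        }
    }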

1.1.4 FRODO

The system architecture for FRODO is shown in the figure below. The architecture is very similar to that of SAM, except that the larger number of analogue inputs may require multiplexing.

Figure - FRODO System Architecture

The SD cards will receive data over the SPI interface; the parallel camera data streams will be converted by the FPGA into a form suitable for storage on the SD card. An illustration of the SPI command framing used to initialise an SD card is sketched below.
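For illustration, the framing of the first SD initialisation command (CMD0, GO_IDLE_STATE) over SPI. In the actual design this logic lives in the FPGA, so this mbed C++ version is only a sketch of the protocol, with placeholder pins:

    #include "mbed.h"

    SPI spi(p5, p6, p7);             // mosi, miso, sck (placeholder pins)
    DigitalOut cs(p8);               // chip select

    int main() {
        spi.format(8, 0);
        spi.frequency(400000);       // SD cards initialise at <= 400 kHz
        cs = 1;
        for (int i = 0; i < 10; i++) spi.write(0xFF);   // >= 74 warm-up clocks

        cs = 0;
        const int cmd0[6] = {0x40, 0x00, 0x00, 0x00, 0x00, 0x95}; // CMD0 + CRC
        for (int i = 0; i < 6; i++) spi.write(cmd0[i]);

        int r1 = 0xFF;               // poll for the R1 response
        for (int i = 0; i < 8 && r1 == 0xFF; i++) r1 = spi.write(0xFF);
        cs = 1;                      // r1 == 0x01 means the card entered idle state
    }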

The MEMSense NANO IMU communicates with the MCU through an I2C interface, transmitting a 37-byte sample block on the shared I2C bus. Transmission begins immediately after the master addresses the slave device and continues cyclically until the master signals the I2C stop sequence on the bus. Each byte transfer terminates with an acknowledge bit, so the bus carries an amortised 9 bus bits for every 8 data bits. With the I2C bus nominally clocked at 100 kHz, a single IMU sample takes approximately 3 (start) + 9 (address, r/w) + 37 x 9 (sample block) + 3 (stop) = 348 bus cycles, for a maximum sample rate of roughly 287 samples per second. A reasonable working maximum is 250 samples per second, which is well above the minimum design requirement of 4 samples per second.
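A sketch of reading one sample block with the mbed I2C API; the slave address is a placeholder (mbed takes it in 8-bit, left-shifted form), and the 250 ms period simply meets the 4 samples-per-second requirement:

    #include "mbed.h"

    I2C i2c(p9, p10);                // sda, scl
    const int IMU_ADDR = 0x32 << 1;  // placeholder 7-bit address, shifted for mbed

    int main() {
        i2c.frequency(100000);       // 100 kHz standard-mode bus, as above
        char block[37];              // one 37-byte sample block
        while (true) {
            // Start, address byte, 37 data bytes each with an acknowledge, stop:
            // roughly the 348 bus cycles estimated above.
            if (i2c.read(IMU_ADDR, block, sizeof(block)) == 0) {
                // parse the sample block here
            }
            wait_ms(250);            // 4 samples per second
        }
    }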

The MPXA6115A pressure sensors are analogue output devices and will therefore require the analogue input (ADC) feature of the mbed. [Still to confirm: the sample rate of the mbed, or the required sample rate for the experiment?] A sketch of sampling and converting the sensor output follows.
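A hedged sketch of the conversion, assuming the sensor's nominal transfer function Vout = Vs x (0.009 x P - 0.095) with P in kPa; the pin and the resistive divider ratio are assumptions, since the sensor runs from 5 V while the mbed ADC full scale is 3.3 V:

    #include "mbed.h"

    AnalogIn pressureIn(p20);          // placeholder ADC pin
    const float VS      = 5.0f;        // sensor supply voltage
    const float DIVIDER = 2.0f / 3.0f; // assumed divider scaling 5 V into ADC range

    float readPressureKPa() {
        float vAdc = pressureIn.read() * 3.3f;  // normalised reading back to volts
        float vOut = vAdc / DIVIDER;            // undo the divider
        return (vOut / VS + 0.095f) / 0.009f;   // invert the transfer function
    }

    int main() {
        Serial pc(USBTX, USBRX);
        while (true) {
            pc.printf("%.1f kPa\r\n", readPressureKPa());
            wait_ms(250);            // placeholder sample period
        }
    }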

1.1.5 Data Handling

Common to all modules' operation is the requirement to store multiple camera feeds onto an SD card. An FPGA will be used to perform the data handling from the cameras and other sensors: the relatively high throughput needed for multiple cameras makes a microcontroller alone unsuitable for data storage, whereas an FPGA allows the custom, parallel, high-speed logic necessary. The camera module provides 16-bit RGB565 data for each pixel, multiplexed over an 8-bit output. The horizontal and vertical position counters are incremented so that each pixel within the image is delivered individually; a model of reassembling the byte pairs into pixels is given below. All control of the camera module is delivered through the I2C interface by the microcontroller. The microcontroller will also supply a data acquisition signal to the FPGA to indicate when to start and stop data acquisition and storage. Further communication between the microcontroller and FPGA will be used for the health monitoring systems, e.g. to confirm the correct programming of the FPGA at start-up.
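A small model of how two consecutive bytes on the multiplexed 8-bit bus form one RGB565 pixel; the byte order (high byte first) is an assumption that depends on the camera configuration:

    #include <stdint.h>

    struct Rgb {
        uint8_t r, g, b;
    };

    Rgb decodeRgb565(uint8_t hi, uint8_t lo) {
        uint16_t px = (uint16_t)((hi << 8) | lo);  // reassemble the 16-bit sample
        Rgb c;
        c.r = (px >> 11) & 0x1F;     // 5 red bits
        c.g = (px >> 5)  & 0x3F;     // 6 green bits
        c.b =  px        & 0x1F;     // 5 blue bits
        return c;
    }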

Figure - Camera Control and Data Storage System

Data from sensors such as the IMU is collected by the microcontroller, where it is assembled into blocks and sent to the FPGA using a suitable communication protocol such as I2C.

The data flow within the FPGA provides the means to store all information from the sensors onto a single SD card, or onto parallel redundant SD cards. Data from each sensor is held in RAM within the device while awaiting preparation for storage, then multiplexed into blocks suitable for the SD card. In this way the data from each sensor is saved to the SD card in a round-robin manner, i.e. camera one, camera two, IMU, pressure, camera one, camera two, and so on; a software model of this scheme is sketched below. [POSSIBLY MAKE DIAGRAM FOR THIS] The multiplexed data is held in a FIFO (First In, First Out) register to avoid overflows when transferring data to the SD card. The SD card may be written using either the SPI or SDIO protocol. All data stored on the SD card is time-stamped using a counter within the FPGA that runs from the first activation of the device, so that the correct timeline can be reconstructed.
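A software model of the round-robin scheme, not the VHDL itself; the block size matches the usual 512-byte SD write unit, and the source names are placeholders:

    #include <cstddef>
    #include <stdint.h>
    #include <queue>
    #include <vector>

    const size_t BLOCK = 512;        // SD cards are written in 512-byte blocks

    struct Source {
        const char* name;            // e.g. "camera one", "IMU", "pressure"
        std::queue<uint8_t> buffer;  // stands in for the per-sensor RAM
    };

    // Drain up to one block per source, in fixed order, into the output FIFO;
    // the FIFO absorbs rate mismatches before the SD write stage.
    void roundRobin(std::vector<Source>& sources, std::queue<uint8_t>& fifo) {
        for (size_t i = 0; i < sources.size(); i++) {
            Source& s = sources[i];
            for (size_t n = 0; n < BLOCK && !s.buffer.empty(); n++) {
                fifo.push(s.buffer.front());
                s.buffer.pop();
            }
        }
    }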

Figure - Internal FPGA Architecture [Possibly need calculations to back up this design and prove that it is feasible]

The data stored on the SD cards will be reconstructed after the experiment is recovered; this frees the in-flight processors to handle only the real-time data capture and operations. A program to extract and format the data will be produced and tested before the experiment to ensure that the raw data from the SD card can be decoded. The throughput of data has been calculated for all components of the experiment and is shown in the table of data rates below. The memory cards for the mission have a capacity of 8 GB and a read/write speed of 95 MB/s, far in excess of the currently calculated data rate; this margin works strongly in our favour by mitigating the chance of write errors.

The video streams are required to record at a minimum resolution of 640x480 pixels with a peak frame rate of 15 frames per second. The intended cameras return 16-bit RGB565 samples via a byte-parallel interface. The resultant bandwidth requirement is then 640 x 480 x 15 x 2 bytes/second = 9,216,000 bytes/second per camera. The cameras will be operational for the duration detailed in the experiment/launch schedule; a worked check of these figures is given below.
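A worked check of the numbers above; the per-camera rate comes from the text, while the three-camera worst case (SAM) and the 120 s recording window are placeholders for the launch schedule, which is not reproduced here:

    #include <stdio.h>

    int main() {
        const long long perCamera = 640LL * 480 * 15 * 2; // 9,216,000 bytes/s
        const long long cameras   = 3;                    // worst case: SAM
        const long long rate      = perCamera * cameras;  // 27,648,000 bytes/s, under 95 MB/s
        const long long seconds   = 120;                  // placeholder recording window
        printf("aggregate %lld bytes/s, total %lld bytes\n", rate, rate * seconds);
        return 0;                    // ~3.3 GB for the placeholder window, within 8 GB
    }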

A partial loss of pixel data would break the alignment and parsing of the video stream data after the mission, so the storage of the camera streams is considered a hard real-time scheduling problem. A static scheduler, implemented in the CPU's assembly language, will be used so that the schedule can be formally verified. This ensures that schedule constraints are met without requiring continual testing of modifications on hardware, which may not be readily available during software development and schedule fine-tuning. The structure of such a static schedule is sketched below.
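A C++ rendering of the static-schedule idea (the flight version would be in assembly, as stated above): every task runs at a fixed offset in a fixed major cycle, so worst-case timing can be checked offline. The task names and 1 ms slot length are placeholders:

    #include "mbed.h"

    void storeCameraByte() { /* placeholder: move one camera byte toward storage */ }
    void pollSensors()     { /* placeholder: sample the analogue inputs */ }
    void updateHealth()    { /* placeholder: refresh the health signals */ }

    struct Slot { void (*task)(); };
    const Slot schedule[] = {        // one major cycle, repeated forever
        { storeCameraByte },
        { storeCameraByte },
        { pollSensors },
        { storeCameraByte },
        { storeCameraByte },
        { updateHealth },
    };
    const int SLOTS = sizeof(schedule) / sizeof(schedule[0]);

    int main() {
        Timer t;
        t.start();
        for (int i = 0; ; i = (i + 1) % SLOTS) {
            int deadline = t.read_us() + 1000;   // 1 ms slot (placeholder)
            schedule[i].task();
            while (t.read_us() < deadline) { }   // busy-wait to the slot boundary
        }
    }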

1.1.6 Implementation

With the architecture and desired operation of the experiment expressed in diagrammatic form, including the data handling and input/output characteristics of the system, the next step is to develop prototype software. Development of the microcontroller software and the FPGA design will take place in parallel: the microcontroller functions are mostly concerned with scheduling and transducer input/output, while the FPGA design is focused on data handling.

The microcontroller will be developed on a test platform, first using simple transducers to model inputs (e.g. a push switch for digital input, a potentiometer for analogue input) while LEDs show output and timing information, as in the sketch below. Next, the microcontroller will be programmed to interface using the I2C and SPI protocols as required, and any custom headers (written in C++ or assembler) for the mbed compiler will be created. This includes controlling the cameras, but not interpreting the data feeds.
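A sketch of this first test-platform stage; the pin choices are placeholders on the mbed LPC1768:

    #include "mbed.h"

    DigitalIn  pushSwitch(p5);       // models a digital input
    AnalogIn   pot(p20);             // models an analogue input
    DigitalOut led1(LED1);
    DigitalOut led2(LED2);

    int main() {
        while (true) {
            led1 = pushSwitch;                // mirror the digital input
            led2 = (pot.read() > 0.5f);       // threshold the analogue input
            wait_ms(50);                      // simple polling loop
        }
    }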

The FPGA will be developed with the goal of storing images on the SD cards. Once the FPGA has successfully stored the image data from one camera, and the data has been successfully reconstructed after storage, the same task will be programmed with two camera feeds. Finally, the sensor information from the MCU will be stored onto the SD cards through the FPGA design. At each stage, the data stored on the SD cards will be analysed to ensure that it is recoverable. It is anticipated that, owing to some complexities (and unknowns) in the operation of the camera, the FPGA architecture may have to be modified to compensate for undesirable data output.

Table : Timeline of Commands for Software

Table : Data rates and collection for sensors