Talk:Dependency injection/Archives/2009/June

Yet another example

I think the example mentioned above is as bad as the one currently in the article. An example explains an abstract idea in a concrete form, so why on earth make the example abstract? Both examples also use horrible naming schemes: either the names are nonsense, or they are way too long. Long, descriptive names are a good thing in general, but this can be pushed over the edge (e.g. ComponentImplementationWithDependencyInjected - I just read blahblahblah).

So here is a better way to put forward an example (in my opinion, I welcome discussion):

Suppose, for example, you are asked to write the software for the next killer application: a blinking LED. Your algorithm will need some external assumptions: you must be able to switch the LED on and off, and you must be able to wait for a given amount of time.

Following the principle of dependency injection (using interface injection in this simple example), you define these dependencies as an interface:

void led_on();              /* switch the LED on */
void led_off();             /* switch the LED off */
void wait_ms(unsigned ms);  /* wait the given number of milliseconds */

This is all it takes to start implementing the main algorithm:

void blinking_led(unsigned period_ms)
{
	unsigned half_period = period_ms / 2;
	while (1)	/* blink forever */
	{
		led_on();
		wait_ms(half_period);
		led_off();
		wait_ms(half_period);
	}
}

All that remains is to decide how you accomplish the goals defined in the interface. You could access the hardware directly, use a driver and operating system calls, send the commands over a field bus, or drive a GUI; the point here is "I don't care". Here is an example implementation that has nothing to do with LEDs:

#include <stdio.h>

void led_on()
{
	puts("ON");
}

void led_off()
{
	puts("OFF");
}

void wait_ms(unsigned ms)
{
	while (ms)
	{
		puts(".");
		ms = ms - 1;
	}
}
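
To try it out, a hypothetical main() (my addition, not part of the example itself) only needs to call the algorithm; with the console implementation above and a period of 10 ms it prints ON, five dots, OFF, five dots, and repeats forever:

void blinking_led(unsigned period_ms);	/* the algorithm defined above */

int main(void)
{
	blinking_led(10);	/* never returns: the blinking loop runs forever by design */
}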

I hope everyone agrees that, compared to the examples mentioned so far, this one is much more understandable. It is deliberately written to be as readable as possible (e.g., I could have written while(ms--), but there is no need to), and the concept can be explained in seconds without making your head explode.

Right now, I think both examples are bad for the same reason: they do not really explain DI. They only explain polymorphism. Ramiro Pereira de Magalhães (talk) 05:14, 21 June 2009 (UTC)
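
For illustration of the distinction being drawn here, one way to make the LED example show actual injection rather than compile-time binding is to pass the dependencies in explicitly. This is only a sketch of that idea; the led_driver struct, the blink count and main() below are additions for the sketch, not part of either example:

#include <stdio.h>

/* The dependencies the algorithm needs, bundled so a caller can supply them. */
struct led_driver
{
	void (*led_on)(void);
	void (*led_off)(void);
	void (*wait_ms)(unsigned ms);
};

/* The algorithm no longer binds to an implementation itself; it is handed one. */
void blinking_led(const struct led_driver *drv, unsigned period_ms, unsigned blinks)
{
	unsigned half_period = period_ms / 2;
	while (blinks--)
	{
		drv->led_on();
		drv->wait_ms(half_period);
		drv->led_off();
		drv->wait_ms(half_period);
	}
}

/* One possible implementation, selected by the caller rather than by blinking_led(). */
static void console_on(void)  { puts("ON"); }
static void console_off(void) { puts("OFF"); }
static void console_wait(unsigned ms)
{
	while (ms)
	{
		puts(".");
		ms = ms - 1;
	}
}

int main(void)
{
	struct led_driver console = { console_on, console_off, console_wait };
	blinking_led(&console, 10, 3);	/* the dependency is injected here */
	return 0;
}

With this shape, swapping in a real hardware implementation means filling in a different struct, while blinking_led() stays untouched.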

Also, the discussion over whether this really is a principle or just another form of higher-order functions, lambdas, abstraction, etc. is sound in my opinion, but seeing how few people grasp this basic idea, I always welcome emphasis on this way of thinking. So, yes, they are manifestations of the same idea, but it is worth mentioning nevertheless. Hansvi (talk) 21:45, 20 May 2009 (UTC)