
Evaluating Human Services
Complexity - Uncertainty - Self-delusion - Rigour


Paul Bullen

Contents

1. Introduction

2. Human Services are Complex

3. Three Useful Strands in Evaluation Processes

4. A Cautionary Note - Transferring Evaluation Tools from Manufacturing to Human Services

This paper was first published in the June 1996 edition of Evaluation News and Comment, the magazine of the Australasian Evaluation Society.



1. Introduction


Evaluating human services is not a simple task. It is complex. There are many uncertainties. Different people can have different views about the same events. People don't always tell the truth. The consequences of human services are usually hard to measure, count or pin down. There is plenty of room for self-delusion.

So high quality evaluation processes are essential in human services.

Some of the small community organisations (many of which employ fewer than 10 staff) I have worked with on evaluation processes are:

  • Family support services
  • Youth refuges
  • Supported accommodation for people with acquired brain injury
  • Community options services (part of Home and Community Care)
  • Dementia support services
  • Family Day Care

These are human services. They are intended to bring about changes in people, for example:

  • Parents will parent better;
  • Young people will have better independent living skills;
  • Children will develop socially and emotionally in ways appropriate to their age and individuality;
  • People with acquired brain injury will live in their own home, participate in their local community and be a valuable member of society.

These services all receive government funding through specific programs, for example: SAAP (Supported Accommodation Assistance Program), HACC (Home and Community Care), CSGP (Community Services Grants Program) and CSP (Children's Services Program).

In this article I make three main points:

Firstly, human services are complex and there is plenty of uncertainty in evaluating them. Five observations are:

1. The services any one community organisation provides are only one small part of people's dynamic, open-ended, complex lives.

2. Human services are often made up of many elements with complex relationships and dynamics between them.

3. There can be as many different views about the same event as there are people you ask.

4. People don't always tell the truth.

5. Stories of real human services disasters come out years later.

Secondly, three useful strands in evaluation processes in human services in community organisations are: collaborative reflection; being grounded in people's experiences and stories; and collecting and using lots of facts and figures to help ask good questions.

Thirdly, I add a cautionary note on transferring management and evaluation tools from manufacturing to human services - a common practice in recent years. Human services operate on working assumptions different from manufacturing's. If manufacturing tools are transferred into human services, they may need to be modified if they are to work in the new setting.





2. Human Services are Complex and There is Plenty of Uncertainty in Evaluating Them


1. The services any one community organisation provides are only one small part of people's dynamic, open-ended, complex lives. Yet services are often represented by simple - simplistic - models.

Models like the following are common:

  Referral → Assessment → Service Provision → Completion

This model emphasises:

  • the importance of the service in relation to the client
  • the simplicity of the service delivery process

This model ignores:

  • outside influences
  • interactions between the elements in the service process

The following diagram provides a different emphasis. The service is one element in a complex web of relationships with the client.

The part the service plays can be quite small compared with the person and all their other relationships and interactions. Even this more complex diagram does not show the changes over time.


It may be as important in evaluation processes to be looking at the web of relationships as it is to look at the service itself.

Are the models of service:

  • Making the service out to be more important than it is?
  • Leaving out the outside influences?
  • Leaving out the interactions between the outside influences?
  • Ignoring the time delays between the interactions?
  • Ignoring the web of human relationships in which people find themselves?

Is the model such an oversimplification that it is no longer useful?

2. Human services are often made up of many elements with complex relationships and dynamics between them.

In the first diagram above the service process was conceptualised as a simple series of steps:

  • Referral
  • Assessment
  • Service Provision
  • Completion

It is assumed that there are cause and effect links through the service processes.

As the following two examples show, in many human services it is difficult to identify in any real way the complex interrelationships between all the elements of the service processes.

It is effectively impossible in many human services to chart cause and effect relationships (in anything more than the most simplistic manner).

For example, family support services, which aim to assist families to cope with the challenges of family life and bringing up children so as to prevent family breakdown, provide an integrated mix of services to families in crisis or under stress. The mix can include:

  • A family worker working with the family in their own home,
  • The parents attending groups (for example on parenting),
  • Child care (while attending groups),
  • Play groups with other parents and their children,
  • Referral to other services.

The overall success of the service for a particular client can depend on the effective integration of all these elements to achieve the client's goals. There is no simple chain of cause and effect. And the goals to be achieved through these services vary from one client to another. A study of family support services in NSW found that clients often work on two or three major goals each, drawn from more than 25 distinct goals that clients typically work on with family support services:

  • To improve self esteem/confidence
  • To improve relationship with partner
  • To improve relationship with children
  • To improve relationships with extended family members
  • To reduce/deal with domestic violence
  • To separate from/divorce partner
  • To improve parenting skills
  • To reduce my social isolation/improve social contacts/networks
  • To improve home management skills
  • To obtain child care
  • To obtain respite care
  • To arrange substitute care
  • To learn budgeting and financial skills
  • To get further education/training (non-literacy)
  • To improve my English/literacy skills
  • To find work
  • To get better housing (not public housing)
  • To get public housing
  • To get legal advice/action re custody of children
  • To get other legal advice/action/support (non-custody issues)
  • To maintain/ improve family health
  • To work on drug and alcohol issues
  • To deal with Dept Social Security
  • To obtain transportation
  • To obtain material assistance
  • Other

When the integrated mix of services noted above is combined with 25 or more distinct goals, there is enormous variety in the service processes and steps that are actually used with clients.

A second example shows similar complexity. Jannawi Family Centre is a specialist child protection service that aims to prevent abuse, neglect and family breakdown by providing a child protection intervention service to "at risk", abused and neglected children aged 0 to 9 and their families. Their program includes:

  • Counselling,
  • Groups for parents,
  • Groups for the children,
  • Other support services such as occasional care, toy library and children's health clinic,
  • Connections with other community agencies such as family support, play groups, couple counselling, sexual assault counselling, and
  • Case review planning and coordination that includes program review meetings, team case planning, staff supervision and case conferences.

All of these elements are highly integrated - that is one of the keys to the success of the service. A simple visual representation of the five-stage, multi-strand program is:



Family Centre Program



Human services are a small part of people's complex, dynamic lives. The services are also made up of many elements in complex relationships with each other. If we oversimplify, will we be able to ask the right evaluation questions? If we oversimplify, will we be believed?

3. There can be as many different views about the same event as there are people you ask.

In many evaluations different groups of people are asked the same questions. For example, in an evaluation of a family day care scheme, parents and carers were both asked whether the carers are paid on time. It is common for the different groups (e.g. staff, clients, family members and significant others) to make different judgements about the same thing. Here are some examples:

In an evaluation of a Catholic boarding school, where 73 parents (of 102 boarders) and 114 boarders completed questionnaires, some of the findings were:

Why is the boarding school important to you? 66% of parents ranked being in a Catholic educational environment as number 1, compared with 6% of students; 67% of students ranked having a wider choice of subjects or reduced travel time as number 1, compared with 24% of parents.

4% of parents felt that their daughter rarely or never feels at home in the boarding school; 39% of boarders said that they rarely or never feel at home there.

In an evaluation of 24 Migrant Access Projects, a project staff member, a funding body representative and an independent evaluator were each asked to rate each project on 9 different criteria. When the responses of the three evaluators were compared, criterion by criterion, then for any one criterion in any one project there was a:

  • 14% chance of complete agreement amongst the evaluators (e.g. unsuccessful & unsuccessful)
  • 37% chance of substantial agreement amongst the evaluators (e.g. successful & very successful)
  • 49% chance of significant disagreement amongst the evaluators (e.g. successful & not successful).
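
Agreement rates of this kind can be tallied mechanically once each set of ratings is classified. The following sketch is illustrative only - the ratings, the four-point scale and the classification thresholds are all invented, not taken from the original study:

    from collections import Counter

    # Hypothetical four-point ordinal scale; the original study's
    # coding scheme is not reproduced here.
    SCALE = ["unsuccessful", "partly successful", "successful", "very successful"]

    def classify(ratings):
        """Complete agreement: identical ratings.
        Substantial agreement: ratings spread over adjacent scale points.
        Significant disagreement: a spread of two or more scale points."""
        positions = [SCALE.index(r) for r in ratings]
        spread = max(positions) - min(positions)
        if spread == 0:
            return "complete agreement"
        if spread == 1:
            return "substantial agreement"
        return "significant disagreement"

    # Invented ratings: (staff member, funding body representative,
    # independent evaluator) for each project-criterion combination.
    triples = [
        ("successful", "successful", "successful"),
        ("successful", "very successful", "successful"),
        ("unsuccessful", "successful", "very successful"),
        ("partly successful", "successful", "successful"),
    ]

    tally = Counter(classify(t) for t in triples)
    for outcome, count in tally.items():
        print(f"{outcome}: {100 * count / len(triples):.0f}%")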

In an evaluation of a Family Day Care Scheme, 72 parents and 26 carers answered a similar series of questions about their relationships with each other. Some of the findings were:

  • 23% of carers don't agree that they are usually paid on time, while 97% of parents say they pay on time;
  • 15% of parents think that they sometimes take advantage of their carers, whereas 42% of carers think that parents take advantage of them;
  • 9% of parents agree that "my carers and I treat each other too much as friends and not enough as parent and carer"; 27% of carers think that "parents treat me too much as a friend and not enough as a carer".

These discrepancies occurred even when parents' responses were matched with their actual carers' responses.

In an evaluation of a supported accommodation service for people with acquired brain injury, the residents, their family and friends, and the staff working with the residents were all asked to rate the quality of the services, in terms of what was being achieved for residents, on 16 different criteria. The following chart shows, for one resident, the responses of six people (the resident, two family/friends and three staff) on the 16 criteria. The lower the columns, the more positive the response; the higher the columns, the less positive the response.

The six people have different views. The client strongly agrees with thirteen of the sixteen statements in the questionnaire. The two family/friends see the situation as less positive. Staff vary considerably in their responses: one staff person is very positive in most of the sixteen areas; another sees four of the sixteen as negative.

Charts were made for all clients in the service. They all showed wide differences in views between different people.

One Client, Six Viewpoints
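
A chart like "One Client, Six Viewpoints" is straightforward to produce once the responses are tabulated. A minimal sketch using Python and matplotlib, with invented scores (1 = strongly agree, the most positive, through 4 = strongly disagree) so that, as in the original chart, lower columns read as more positive:

    import numpy as np
    import matplotlib.pyplot as plt

    # Invented responses for one client: six respondents rating 16 criteria
    # on a 1 (strongly agree, most positive) to 4 (strongly disagree) scale.
    rng = np.random.default_rng(0)
    respondents = ["Resident", "Family/friend 1", "Family/friend 2",
                   "Staff 1", "Staff 2", "Staff 3"]
    criteria = np.arange(1, 17)
    scores = rng.integers(1, 5, size=(len(respondents), len(criteria)))

    # Grouped bars: one cluster of six columns per criterion.
    width = 0.13
    for i, (name, row) in enumerate(zip(respondents, scores)):
        plt.bar(criteria + (i - 2.5) * width, row, width, label=name)

    plt.xlabel("Criterion")
    plt.ylabel("Response (1 = most positive)")
    plt.title("One Client, Six Viewpoints (hypothetical data)")
    plt.xticks(criteria)
    plt.legend(fontsize="small")
    plt.tight_layout()
    plt.show()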

The evidence suggests that it is not possible to assume that different groups of people have the same views on the same issues. So in your evaluation process, who are you talking to? Who could or should you be talking to? What will you do when you hear totally different views about the same thing? Collaborative reflection is a useful strategy - a group of people coming together to reflect on what the divergent views mean. Getting the different groups to talk to each other is also very useful.

4. People don't always tell the truth.

Many evaluation processes assume people will tell the truth if asked. There are many reasons why they don't.

  • When services are evaluated clients can feel that if the service doesn't get a good mark the funding might be cut off - so they say only good things.
  • Clients do not feel comfortable criticising the staff providing the service for fear that they may not continue to get the service or there will be some form of retribution.
  • Staff do not wish to acknowledge publicly that they operate outside the funding guidelines (in the interests of the clients).
  • Program managers and administrators do not want to be told that 'such and such' is happening because if they knew they would have to do something about it - so no one tells them.
  • Staff do not want to acknowledge mistakes for fear of 'punishment' in an organisation that has a culture of 'people being punished for mistakes' (rather than mistakes being seen as opportunities for improvement).
  • People involved in the service believe that what they are doing is good - so they don't put sufficient weight on the not so good parts.
  • A client (or staff member) may have such a bad experience that in the short term the emotional price of talking about the experience with others is too high and so nothing is said.
  • Group think - people do not allow themselves to think outside the group norms, beliefs, etc and take an alternative point of view. Dissent is not allowed.
  • People want to hold onto power and resources. In extreme cases, people are corrupt and deliberately lie and mislead in order to keep power and resources.

Not telling the truth can range from failing to talk about the real issues to deliberately lying and misleading.

In one religious organisation a survey of about 60 religious sisters working in ministry found:

  • 97% rated the efficiency of their work as average, good, or excellent
  • 100% thought the extent to which the work adequately met the needs of the people being served was average, good or excellent
  • 100% rated the appropriateness of the work in terms of fundamental values and beliefs as average, good or excellent.

In my experience with other organisations these results were "too good". They were not believable. Had everyone told the truth?

When these 60 people were brought together into one room and asked 'What are the major issues that must be addressed in relation to the works undertaken by the sisters?', large numbers of participants asked serious questions about the work being undertaken by others, or made statements such as 'we are not sure that such and such a ministry should continue'. People in the ministries in question also acknowledged that they had serious questions and reservations about the value and future of various ministries. People had not 'told the truth' in the questionnaires.

While there are some strategies for dealing with the reality that people do not always tell the truth, sometimes evaluators do not have the powers or tools to find out what they need to know. A Royal Commission might be required. For example, the current NSW Wood Royal Commission into police corruption is essentially an evaluation process with special powers and resources.

There have been many occasions at the Royal Commission when witnesses have given evidence which they have subsequently changed after hearing or seeing other evidence, such as audio tapes of conversations or video tapes of meetings.

In the evaluation processes you are using, are you assuming people will tell the truth? There is plenty of evidence to suggest that not everyone tells the truth all the time. How will this affect your evaluation processes and the interpretation of the results?

5. Stories of real human services disasters come out years later.

There are many examples of disasters in human services slowly becoming public years after the event.

Chelmsford Hospital is one example. Dr Bailey ran the now-infamous "deep-sleep" unit which caused the deaths of at least 24 people between 1964 and 1979. Deep-sleep treatment ended in 1979, and Chelmsford is now under new management.

Deep sleep therapy was given for more than 10 years and it took more than 20 years for a 'public evaluation' of what had happened.

1950s - Psychologist Dr Evan Davies begins his professional relationship with psychiatrist Dr Harry Bailey, the director of the Cerebral Surgery Research Unit at Callan Park Psychiatric Hospital in Sydney.
1963 - Bailey begins sending patients to Chelmsford Private Hospital, where their treatment includes being put into drug-induced comas.
1979 - Deep-sleep therapy ends after complaints about patients' deaths are lodged with the Health Department.
1988 - Royal commission into deep-sleep therapy is appointed after the Herald reveals details of a confidential Health Department investigation into 24 deaths at Chelmsford.
1990 - The commission releases a damning report on deep-sleep therapy.
Sydney Morning Herald, Page 11, Tuesday, 4 July 1995

What evaluation processes should have or could have picked up the issues at the time?

Another example is the sexual abuse of children by their teachers in schools. Stories are now coming out about events over the past 60 years.

More than 100 Christian Brothers in Australia have been accused of abuse in existing or threatened legal proceedings during the past eight years. Since 1992, four brothers have been found guilty and a further five charged in sexual misconduct cases brought before Australian courts. In the civil courts, 201 men who were once orphans under the care of the Christian Brothers in Western Australia want to file a massive compensation claim for physical and sexual abuse they allegedly suffered between 1938 and 1976; according to their lawyers, the men have accused 93 brothers, at least 20 of whom are alive.
These snowballing allegations have all but destroyed the reputation of a religious order that educated some of the most prominent Australians of the past half-century. It has unfairly tainted the many decent brothers, past and present, who have worked in Australian schools. But the deeper question, one that has only recently surfaced, is how culpable were senior executives of the Christian Brothers in this calamity?
Sydney Morning Herald, Page 1, Saturday, 22 July 1995

Why were these human service disasters not picked up at the time?

How do you know that you do not have another Chelmsford?

If you did, could it be identified using the evaluation tools and strategies that are usually used?

How?

Are we any better at it now than 10 years ago or are the dynamics that kept past human service disasters hidden still at work today?



Discussion Starters

Evaluating Human Services

  • There is an openness in being human that demands uncertainty.
  • Human services are a complex web of personal interactions.
  • There is plenty of room for self-delusion.
  • Rigorous evaluation is essential in human services (more essential than in areas where there are more certainties).
  • Programs are myths to make it easier for managers and administrators.
  • Manufacturing and human services are fundamentally different.
  • Looking at the web of relationships may be more important than looking at the service.
  • Performance indicators that judge human services are a theoretical impossibility.
  • When things seem very good they may be really very bad.


3. Three Useful Strands in Evaluation Processes

They are:

  • collaborative reflection;
  • being grounded in people's experiences and stories;
  • using lots of facts and figures to help ask good questions.

If we take seriously that:

  • Any particular human service is a small part of people's complex, dynamic lives;
  • Human services themselves are complex open systems;
  • There can be many views about the same situation;
  • People don't always tell the truth;
  • Measuring and counting the consequences of human services is very difficult;

then evaluation processes that are able to deal with this reality inevitably require a group of people to collaboratively reflect and dialogue.

Evaluations in which an external evaluator assesses a service and makes a definitive judgement of its value are not appropriate. The role of the evaluator is to facilitate a process of dialogue and reflection that is grounded in people's experiences (e.g. clients') and in other relevant facts and figures.

In working with community organisations in evaluation processes I often find the following three strands useful:

  • (a) A group of stakeholders (e.g. a steering committee, task force, working party) working collaboratively through a process over five or six meetings where they work on:
What is the purpose of the evaluation process?
What are the questions we want to ask?
What data is required?
How can we get it?
What does the data mean?
So what? What are we going to do now?
  • (b) Parallel with the steering committee or task force there is a process of hearing first hand from stakeholder groups, for example through interviews with clients and staff, or focus groups with clients or other service providers.
  • (c) There is also a strand of 'hard data' collection: counting the numbers of clients and client characteristics; working out ratios such as dollars per hour of direct service or staff hours per hour of direct service; and collecting data from client databases or information systems, or undertaking surveys of clients or staff. (A sketch of the ratio calculations follows this list.)
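
As a concrete illustration of the ratio work mentioned in (c), the sketch below computes two common ratios from a handful of figures. All the numbers are invented for illustration:

    # Hard-data strand: two simple service ratios from invented annual figures.
    total_funding = 180_000        # dollars of funding for the year (hypothetical)
    direct_service_hours = 4_500   # hours of face-to-face service delivered
    total_staff_hours = 11_200     # all paid staff hours, incl. admin and travel

    dollars_per_service_hour = total_funding / direct_service_hours
    staff_hours_per_service_hour = total_staff_hours / direct_service_hours

    print(f"Dollars per hour of direct service: ${dollars_per_service_hour:.2f}")
    print(f"Staff hours per hour of direct service: {staff_hours_per_service_hour:.1f}")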

These three strands come together as part of an integrated whole. Each influences the other in a dynamic interaction over time.



Evaluation Processes

Clients' Experiences and Stories

It is not possible to understand human services without clients' stories. Compare the following two descriptions for family support services.

Quoting from the statewide data collection:

In 1994 there were 134 Family Support Projects being run by 129 organisations. These Family Support Projects:
* serviced approximately 12,700 family worker clients during 1993;
* in Census week 1994 serviced approximately 4,200 family worker clients (i.e. the number of family worker clients receiving service at any one time);
* in Census week 1994 worked with approximately 1,270 families who have children who are known to have been notified to the Department of Community Services as being at risk;
* provided groups for approximately 3,300 participants during Census week;
* ran approximately 10,850 group sessions during 1993/4.
The Projects employ approximately 690 staff, mostly on a part-time basis. This is the equivalent of approximately 380 full-time equivalent positions (based on a 35-hour week).

Listening to clients tell their stories:

I tried everything with my husband to make things work. He is a drug addict. I don't want to live with a drug addict. All my life I have been told what to do and for once I have come to a place that allows me to work out my problems.
It's scary, I have taken responsibility for my life. This has been a complete turn around in my life and such a learning experience. I am going to keep going.
It has taught me to relate to my feelings; at one stage I just blocked them out but as I got my strength back and sorted out who I was I can relate with my feelings. When I come here I let the barriers down because I feel secure and safe.
It is as if I have never had supportive parents and I am getting it here. I can now take responsibility for my life.

Another client's story:

Now I am working as a bookkeeper. One of the women from the Management Committee suggested I apply. I was unemployed at the time. Now I am using my brain again. It is a break for me and it is getting me back into the workforce. It is easing me back into working again. I am treating it as a proper job. I'm getting used to using my brain again.

The clients' stories are insightful and compelling. They are essential for understanding the service. But the stories alone are not enough. We also need the facts and figures: the answers to how many clients? for how long? how many hours of service? and so on.

Facts and Figures

There is a common attitude in small community organisations that funding bodies expect numbers (performance indicators, statistics or other "hard data") to judge the service. There may be truth in this - that is, funding bodies may indeed expect numbers to judge the service.

The attitude that numbers judge us may have its roots in our experiences at school of getting marks out of 10, or out of 100, from our teachers for assignments and tests. It may have its roots in a nineteenth-century scientific world view; the roots might even be in economic rationalist thinking.

Wherever the roots of the attitude come from, in human services it is not possible for numbers to be the judge. People make judgements. Numbers don't.

Numbers, and lots of numbers, are needed, not to be the judge, but to help us ask good questions; to help us find the right questions.

In the public arena there is always debate about whether unemployment figures, balance of payment figures or government budget deficits are indicators of good times or bad times.

Each publicly released figure usually receives at least two totally different interpretations.

This is also true in human services. A useful exercise with any set of facts and figures for human services is to put them in front of a group of staff or management committee members and simply ask: are the figures good, or not good? Usually there are different points of view. In the ensuing discussion the important issues often emerge.

The following facts and figures are from a study of 112 family day care schemes in NSW. Family day care coordinators were asked to rate aspects of their scheme out of 10 (0 = terrible; 5 = passable; 10 = excellent).

100% of family day care coordinators in NSW rate the quality of care for children 6 or more out of 10. Is this good or not good?

  • It is too good - people are deluding themselves.
  • It is very good - I trust the coordinators' judgement and know that we have put a lot of effort into improving the quality of care in recent years.
  • It is not good - we should have 100% of coordinators rating quality of care at 8 or more out of 10.

20% of family day care coordinators in NSW rate their relationship with the Department of Community Services between 0 and 5 out of 10; 80% rate it 6 or more out of 10. Is this good or not good?

  • It's terrible - we can't afford any bad relationships.
  • It's OK - given the changes within the department in recent years, I'm surprised it's not 40% with a rating of 5 or less.
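
Figures like these come from straightforward tabulations of the raw ratings. A minimal sketch of the threshold counts behind such statements - the ratings below are invented, so the percentages will not match the study's:

    # Tabulating ratings out of 10 against a threshold.
    def percent_at_or_above(ratings, threshold):
        """Percentage of ratings at or above the given threshold."""
        return 100 * sum(r >= threshold for r in ratings) / len(ratings)

    # Invented coordinator ratings (0-10) for two aspects of the schemes.
    quality_of_care = [7, 8, 6, 9, 7, 8, 10, 6, 7, 9]
    dept_relationship = [4, 8, 7, 6, 3, 9, 8, 7, 5, 6]

    print(f"Quality of care rated 6+: {percent_at_or_above(quality_of_care, 6):.0f}%")
    print(f"Quality of care rated 8+: {percent_at_or_above(quality_of_care, 8):.0f}%")
    print(f"Dept relationship rated 6+: {percent_at_or_above(dept_relationship, 6):.0f}%")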

In your evaluation processes:

  • Where are the opportunities for people to collaboratively reflect and dialogue?
  • How will clients' experiences be heard first hand, with all the real-life complexities?
  • Do you have enough facts and figures to find the right questions to ask?




4. A Cautionary Note - Transferring Evaluation Tools from Manufacturing to Human Services


Transferring management and evaluation tools designed for manufacturing into human services needs to be done with caution.

Human services are intended to bring about changes in people. The person is the 'product'. Without wanting to oversimplify either human services or manufacturing, some of the key differences between human services and manufacturing are:

  • Clients receive individualised rather than standardised services (couple counselling is tailored to the couple; the services people get will vary with their individual needs and issues to be worked on; if I buy a set of 6 kitchen chairs you can buy another essentially identical set).
  • Service processes are loosely defined (the actual steps in domestic violence counselling are not as well defined as the steps in the manufacturing and sale of computer memory chips).
  • Human services are often provided in one to one relationships or in groups where people interact with each other. (Milk bottles on a production line don't talk to each other or talk back to the machine operators).
  • People make choices (television sets don't).
  • The outcome is harder to measure. (Self-esteem of an adolescent is more difficult to measure than whether or not a person purchases a light bulb that works.)
  • It is difficult to show cause and effect relationships (when a person in a job placement program does not get a job it does not necessarily mean the job placement program is not good - there are many alternative explanations; if a chair comes off the end of a manufacturing production line without legs it is almost certain that a person or piece of machinery on the manufacturing line fouled up).

These differences between manufacturing and human services mean that when transferring management or evaluation tools from manufacturing to human services it is important to ask: What assumptions is the management or evaluation tool based on? Do those assumptions hold true in human services?

For example:

  • Quality control
  • Performance indicators
  • Total Quality Management and
  • Benchmarking

all have their origins in the manufacturing sector. They all assume:

  • cause and effect relationships can be identified;
  • things are easily counted and measured;
  • processes are well defined and standardised.

In many human services these assumptions do not hold.

These management and evaluation tools have worked well in manufacturing and have been used to bring major improvements to the sector. Total Quality Management, for example, is given major credit for the change in the quality of Japanese products from the 1950s (shoddy, cheap, nasty) to the 1990s (state of the art, quality, value for money).

In human services the story can be different. A process I often see is:

a) An in principle decision is made to introduce the new management tool (often because of a political decision or the latest best-seller).

b) Attempts are made to introduce it (in the same way it worked in manufacturing). A lot of time and energy is expended, often with little result and with very frustrated workers, who either feel that if only they knew more they would be able to get it right, or feel angry that such a patently stupid request could be made.

c) After some time acknowledgements are made that in human services it is not possible to use the tool in exactly the way that was first thought and/or intended.

d) But the new management tool still has to be in the organisation's repertoire because of a political imperative or the need to save face.

An example of this process was the introduction of performance indicators in the NSW Department of Community Services Community Services Grants Program.

In 1989 the NSW Department of Community Services stated that all programs funded through the Community Services Grants Program had to have performance indicators to measure outcomes.

After three years' work by funded services and the Department, with little effective result, the Deputy Director General of the Department said on 31 March 1992:

"The Department is making clear to Government and making clear in its own services, as well as those that are funded, that performance indicators will not, cannot and should not be the focus for measuring outcomes.
"We are stepping away from using performance indicators as measures of outcomes to collecting indicators of process and output.

In 1996 there is still an expectation that indicators will measure outcomes - but as far as I am aware no service has been defunded because the numbers looked bad, so the indicators are not really being used to measure outcomes.

When introducing management and evaluation tools into human services from other settings some questions are:

What are the assumptions that the tool is based on?
Do these assumptions apply in human services?
If they don't apply what modifications or changes need to be made to the tool? To the way it is used?

Conclusions

In human services there are inevitably high levels of uncertainty about the effects of services and how they are achieved. This is not a problem; it is as things should be. There is an openness in being human that demands uncertainty.

The uncertainty leaves plenty of room for poor quality service processes and outcomes, and for self-delusion.

Because there are higher levels of uncertainty and plenty of room for self-delusion, evaluation in human services needs to be more thorough, more rigorous and of a higher quality than in areas where there are more certainties. We need to work together to ensure that the worst doesn't happen. And that the best does.

Some useful strands to have in evaluation processes in human services are: collaborative reflection; being grounded in people's experiences and stories; using lots of facts and figures to help find the right questions to ask.

Within the current cultural and political climate in Australia there are major challenges to be faced:

  • For clients to tell their stories and be heard.
  • For human service providers to take political action to ensure that the underlying realities of human services are understood and taken seriously in the design, management and administration of funded programs.
  • For funding bodies to be grounded in the realities of human services (complex services in complex, open-ended lives) as they are actually provided through programs, not just in the program description, guidelines and policy manuals.




References

Australian Council of Social Service (1992). Report of the Evaluation of the Migrant Access Project Scheme. Department of Immigration, Local Government and Ethnic Affairs, Canberra.

Bullen, Paul (1993). Mt Erin Boarding School Review - Mt Erin Boarding School: A Framework for the Future. Presentation Sisters, Wagga Wagga.

Bullen, Paul (1994). Sydney Family Day Care Review. Uniting Church Board for Social Responsibility, Sydney.

Bullen, Paul (1994). Wareemba Community Living - Service Quality and Directions - Supported Accommodation for People with Acquired Brain Injury. Wareemba Community Living, Sydney.

Bullen, Paul (1995). Working with Families at Risk - A Second Exploratory Paper, Family Support Services Association of NSW, Sydney.

Bullen, Paul (1996). Exploring Family Day Care in NSW, The NSW Family Day Care Association, Richmond (NSW).

Bullen, Paul & Robinson, Collin (1994). Family support Services in NSW, Family Support Services Association of NSW, Sydney.





Paul Bullen

Paul Bullen runs Management Alternatives Pty Ltd (established 1988), a management consultancy specialising in work for the community-based, non-profit, welfare and church sectors and for government.

In recent years he has worked with organisations in these sectors on organisational review; planning and evaluation; facilitation; research, data collection and analysis; and staff training and development.

He has 20 years' work experience in these sectors. He was previously Executive Director of the Home Care Service of NSW (1985-88), Assistant Secretary of the Catholic Commission for Justice and Peace (1982-85) and a Research and Resource Analyst for (the then) Department of Youth and Community Services (1980-82).

He has an academic background in social science (social work), philosophy and theology. He has specialised in research and evaluation.