
Complexity and Decision Making in the Wild - Margaret Hinrichs


Transcript


29:31

Activity driving why we're doing this.

29:35

And then lastly, kind of the longest-running one that I've been a part of, again, is the Achieve 60 model.

29:43

And in terms of this timeline, this is an example where we thought, when we set out, that we were going to create kind of a student-centric model: be able to trace students through the pipeline of Arizona's education system and say, okay, based on these factors,

30:00

can we predict your success in the pipeline? Can we try and identify points at which you might fall out of the pipeline? What kinds of strategies or solutions can we have in place to make sure you stay in and to ensure your success?

30:15

And once we got into the actual process of trying to put this together, we realized that student-level education data in Arizona was going to be very, very difficult to come by, and that the kinds of trust and relationships and investment, and the time and effort

30:32

and energy needed to get that data, were realistically out of the scope of our current project. It wasn't like the Littleton project, where we really had that trust and relationship already in place through one of our faculty collaborators.

30:48

Here we were kind of starting with a blank slate.

30:51

And so we had to reevaluate with our collaborators what kinds of questions we wanted to answer with this model, and what was useful to the Board of Regents, for example.

31:01

And so over several months of interactions with people in the education community, we realized there were two main conversations, or questions, driving the creation and use of this model.

31:17

The first was: which 60%?

31:20

If we're focusing on 60% of adults with post-secondary attainment, does it matter which 60% we focus on?

31:28

And the very clear answer became: yes, we need to think about it. Do we do a strategic 60? Do we do it across the board? What does it mean to focus on this 60% versus a different 60%?

31:45

And then there's the other question that kind of goes along with that as well: what about the other 40%? What happens when we get to 60, do we just stop? What happens to the other 40? And how can we think, from an economic and business development

31:58

standpoint, about not just treating post-secondary attainment as kind of a foregone conclusion, pursued just for the sake of getting it: what does that then do to the Arizona economy? What does that then do to occupations and business development?

32:14

And so those became the driving streams of research for that model.

32:20

And again, this is a learning process. There was a lot of iteration, and it took a really long time to even understand that that was the landscape we were dealing with.

32:30

And just to get there took months, if not the better part of a year, until we really knew which questions we were going to be able to credibly attend to with the data that was available to us.

32:43

And this can be frustrating for developers, and was frustrating to many of the developers that we work with, because things kept shifting, things kept changing.

32:54

And this is part of the normal process, we know now, of going back to the community, doing credibility checks, and trying to figure out: are we on the right track?

33:03

Is this what we need to be asking? Is this useful to you? Is this in service to you? And is this something that you could see yourself picking up and taking forward?

33:12

And so the use of data visualizations and data modeling in this process has really been at the core of a lot of the projects and approaches that we've taken in terms of knowledge convergence: really leveraging their power as boundary objects to reveal

33:28

implicit biases and make explicit the knowledge that people carry around with them but don't know is there until there's something on the screen that they can point at, or try out, or test.

33:45

One of the last plays I'll talk about is managing the inclusion process to optimize stakeholder commitment and satisfaction. So you've identified your stakeholders, and you've conducted your personal inventory of your project readiness.

34:00

But how do you begin to manage the flow of collaborators? There are a couple of procedural pointers here, and again, the playbook has a whole section on this; if you're interested, I recommend that you go check that out.

34:14

But think about prioritizing who to include, and in what order; creating ways for partners to demonstrate their commitment to the project, so even making small milestones or key points where people can basically have something

34:30

to show for their involvement; creating feedback loops to incentivize further collaboration; and encouraging critical self-reflection on the collaboration process.

34:40

And so this is particularly important in voluntary environments.

34:44

And we see a lot of this in community collaboration: they are volunteering to be there, they are putting, you know, extra time and energy into working with researchers at ASU, and they're not always required to do this as part of their job. And

35:00

so how can we attend to that, and really make sure that their time and support and attention are going where they can have the best benefit?

35:10

And so two projects in particular that illustrate differences in approaches to managing this inclusion process are the education projects that I mentioned, as well as another one on continuity of care for people with serious mental illness.

35:24

This was funded through the Robert Wood Johnson Foundation, and it was also a project that Dr. Katie Pine and I worked on with several faculty from the downtown campus and public health. The citation is below if you want to dive into this in a

35:38

little bit more detail; the publication really goes into more of the background of the project.

35:44

But in terms of managing the inclusion process.

35:47

I would say, with the education project, we really took community organizations one by one. And part of that was because education is such a hot topic in Arizona

36:03

that we were trying to just collect all of the different stories in the beginning. And if we had too many different voices in the room,

36:11

we didn't want it to just become, you know, a debate of sorts; we really wanted to hear from the individual community members one on one first.

36:18

And so that was how we ran a lot of our introductory workshops for the Achieve 60 project.

36:24

And so you can see what we did:

36:27

We spoke with over 200 people.

36:30

And so these two projects differed in other ways as well.

36:34

The role of the researcher was also different in these two projects, and the level of citizen participation was different. I don't like the word "placation," but that was what fit the most from the article that I was pulling from to try and categorize

36:48

these different things. And then, in terms of who owns the final tool, there were a lot of differences in these two projects. The continuity of care project was specifically designed to have multiple representatives from different

37:07

social sectors in the room dissecting a data visualization, similar to the global learning metrics project that I talked about before. We had people from the medical claims side, people from the sheriff's office and probation, and we had people from the Crisis

37:24

Response Network.

37:25

We had people from different health care agencies, and EMTs, in the room. So a lot of people from different sectors which touch on people with serious mental illness as they move through the system.

37:39

They were in the room to talk about the data that was on the screen, in terms of AHCCCS claims and rates of things like serious mental illness, and other health data that we were able to put together.

37:52

And so we had a lot of people.

37:55

So you had people from the sheriff's office interfacing with people from public health, interfacing with people from the insurance companies, and starting to have these conversations, because they all touch the patients, basically, at different

38:08

points in the patient's journey.

38:12

But a lot of these people had never met their counterparts in the other organizations. And so, from an interpersonal perspective, those workshops really served to create new kind of connective tissue between those different social sectors,

38:28

and I think, for some of them, there was even a new grant proposal that grew out of it: two people who had been attending several workshops decided to work together on one. And we did a series of qualitative interviews

38:41

at the end of those workshops, to do an assessment: what did people think of this process, and what did they get the most out of it? And over and over, what we heard was that the relationships they made with the other people in the room were a

38:55

really big takeaway for a lot of people, one they hadn't necessarily expected. They thought they were, you know, coming to look at data and give their viewpoints on graphs that they were seeing on the screen, and to inform how we were looking at

39:09

different serious mental illness claims. But what they really found was new collaborations and new connections with the other people in the room.

39:17

And so again, that idea of having requirements in this process: to be physically present, to be iterative, to keep coming back to the same people, to be having these conversations that keep evolving over time. Being able to have that protected

39:32

time and that protected space as part of this process has been a huge deal for us and for the success of these projects.

39:41

So if you're curious about more of the comparisons between these two projects, I recommend you go check out the citation.

39:48

The last play I'll briefly touch on is leading to a productive conclusion. And so here we pull a lot from an article that Erik Johnston has co-authored on the useful life of network governance, and there's a lot of conceptual overlap here in terms of:

40:06

how do we know when it's time for a collaboration to end, and what does it look like to do that gracefully? Because sometimes collaborations reach their natural conclusion, and they either need to come to an end or perhaps they transform into something

40:22

else. But how can we manage that process? How do we make sure relationships are being protected and maintained in a way that can still lead to constructive collaboration later on? Because one thing we find when we start

40:38

to work with communities is that we are unaware that that community has perhaps already had a rich collaboration with a faculty member at ASU or somewhere else, and they are kind of left with this bad taste in their mouth of: well, once the researcher

40:54

comes in and gets what they need, they just leave, and then I'm here with all the problems I had before. And so that's something we're actively trying to work against: to attend to that history, to be aware of that history, and to make sure that

41:09

all of our collaborations are mutually beneficial, that we're really helping each other, and that it's not one-sided, that someone's not just coming in, taking what they want, and leaving the community worse off than they found it.

41:26

So these are just a sample of a couple of the different plays, and some of the stories behind them, that are in the Knowledge Exchange Playbook on the Knowledge Exchange for Resilience website.

41:40

If any of what I've said today is of interest to you, I encourage you to go check it out and download it. See if something is helpful. And again, if there's something you think we've missed, if there's a play that you think should be added, if there's a tool

41:53

that you've used in your community engagement that's been really successful for you.

41:58

We would love to add it. This is a living document; it is absolutely not meant to be the end-all-be-all of this process. But we really want even just the process of creating this playbook to be constructive and to build a network around

42:12

people that are interested in community-engaged research and want to do it in a just, equitable, diverse, and inclusive way, and really are doing it with the community

42:23

at the heart of it, really driving a lot of the questions that we're asking.

42:28

And so that is, that's the end of my presentation portion. I'm happy to take questions. And Trish, before I forget, I will put up the slide that shows the upcoming talks; Shade is up next

42:42

with interdependence networks. But thank you very much for letting me share all of these stories and lessons learned from the field. We've learned a lot in the past couple of years doing this; we've made some mistakes that we're hoping to help

42:58

other people avoid in the future, but for the most part they have been valuable mistakes, in that we've learned a lot, and we're still learning, we're still figuring this out.

43:07

So thank you.

43:26

Thanks.

43:32

Gosh, I was going to say: if there are any questions or thoughts or comments or additional stories, or if this made you think of something that also happened to you,

43:41

I'm happy to hear them.

43:50

You have a question in the chat:

43:54

Are all boundary objects data?

43:59

Kind of a philosophical question; I guess it depends on your definition of data. So if you go into the actual boundary objects literature, there are a couple of different types of boundary objects: things like maps, things like actual databases,

44:17

really anything that can serve as, like, a plastic sensemaking mechanism.

44:24

And so, in my experience, in a lot of the work that I've done, it's been data,

44:28

kind of like data visualizations in particular, you know, graphs and models.

44:33

But I mean, I guess: is a map data? Like, to someone, yes, that's data.

44:41

In my worldview, you know, a hammer sees everything as a nail; in my worldview, I probably see most boundary objects as data, but I'm sure you could see it another way.

44:56

And you absolutely are. So, I've gotten a couple of messages: the playbook is freely shareable with whoever you would like to share it with, whoever you think might be interested. It's freely downloadable from the website, and you are welcome to

45:14

download and share it.

45:21

What's the range of data we've worked with, and is it all quantitative? So, no, it is not all quantitative. Quantitative data, by virtue of what it is, is kind of easier to classify and graph.

45:36

But we have. Other researchers at Decision Theater have done, and I think even Michael Simeone, who I think is on the call, has done, textual analysis and SWOT analysis of textual data, and there are other kinds of qualitative data

46:04

and knowledge that you could also work with.

46:04

It looks like Michael has his hand up. So, let's see if he has a question.

46:09

Yeah, thank you, and thank you for the presentation. Actually, some of the

46:15

"do not do's" that you were mentioning, I could feel myself doing in real time, maybe even in that photo that was in your slideshow.

46:22

So I really appreciate that perspective.

46:26

So, I'm interested in your emphasis on boundary objects and data visualizations and maps, but I noticed, and also know, right, that you oftentimes work in a multi-display environment.

46:40

Can you say a little bit more about what a multi-display environment does for collaborating with people around boundary objects?

46:49

Yeah, for sure. So you'll see a lot of the pictures that I showed were in what's called the DT Drum; Erik is there right now, so he's showing you.

46:58

Yes, so there are multiple different display options, and one of the concepts that we've been working on is this idea of focus stacking. This is a term from computational photography: the idea that if I'm pointing my camera at a flower,

47:14

I can engage in different levels of focus: I can focus on the stem, or a leaf, or the plants behind the flower.

47:22

But then in computational photography they do what's called focus stacking, where they kind of put all of those photos on top of each other and everything is in focus.

47:30

And so that's kind of the idea behind our use of multiple displays; the DT Drum is one example of that. And the global learning metrics

47:42

kind of straw-man visualization that I put up did that as well, where we had, like, environmental data on one screen, economic data on one screen, life expectancy on one screen; we had, you know, children, we had how many households have

47:56

internet, and so on. So we had, like, a lot of these different slices of data to try and create this multi-dimensional view of what it looks like in one country versus another, in that particular example.

48:12

And so in that case, and in cases like that, or with the education model, we were trying to say: okay, not just looking at rates of post-secondary attainment on this screen, but geographically, where in Arizona are they, on this screen?

48:27

And what does economic output look like on this screen? And so, having multiple displays available to us helped us kind of focus-stack, in a way where we could attend to multiple different parts of a conversation at the same time.

48:40

That's awesome. I'd never heard of that concept used in this way before. Thank you.

48:49

There's another one in the chat.

48:52

How do you determine the levels of partners you work with? For instance, in the Littleton projects, were there teachers or families involved? This is a great question.

49:01

So for our initial foray into the Littleton project, we wanted, we needed, the buy-in of school district administrators, and so we began our series of engagements at the time with principals and vice principals and a couple of the district administrators.

49:20

But one of the things that we brought up, that we can critique ourselves on in that process, was: should this collaboration continue? And we kind of had our first, what we call a season, our first season, which came to an end

49:34

about a year ago; so, well, it was before COVID, so a little over a year at this point.

49:39

But we discussed that explicitly: were this to continue on, we would want teachers in the room, and we would want other people in the room, who at that point were kind of data points on a screen but were not physically represented in the

49:53

room, to share their stories of why they thought the trends on the screen might look the way they did. And so the short answer is, for that series of engagements,

50:04

it was principals and vice principals, because we needed their buy-in, especially because they were the ones sharing their school data with us.

50:12

But if it had continued on, and this is still up in the air for future collaborations with them, we absolutely wanted teachers in the room, because one of the things we were talking about was teacher assessments and how teachers were managing students.

50:26

And so we absolutely would want their voices if we continued on.

50:37

Trying to keep up with the chat.

50:43

Yeah, systems archetypes as different lenses, that's interesting.

50:49

Yeah, similar metaphor, yeah, different microscope magnifications.

50:53

Yeah,

50:59

An additional object as an alternative boundary object.

51:05

Thank you. Yeah, I'll check that out.

51:21

One of the students here is very excited about all the things we did wrong, and would like to hear more.

51:29

More about what we did wrong; but I think with a good intention, in terms of what did we learn from what was wrong.

51:38

And so, yeah, I think one of the...

51:44

But, you know, Erik, you are my partner on most of these projects; was there something that you think we did really wrong, but in a good way? Let me do mine so you can sort of think through a couple too. But one of the things I think we did wrong

51:57

regularly at Decision Theater was presenting to people when they'd come, showing them all the wonderful models that we could do, and then shepherding them out of the room

52:07

as soon as we were done presenting, instead of leaving space for them to think about: well, this is what we made, this is what that made me think of, these are the options, I now see the world differently because of it. We didn't leave that space for

52:20

the co-production and the conversation a lot of the times that we introduced things at Decision Theater. And so now we really think about how we develop this to be more interactive, and allow them to be curious within this

52:33

environment, and then to capture that curiosity. That's something I think we did wrong and that we learned from.

52:40

Yeah, I think that's a really good thing to be reminded about, because it makes me think of something that I know Joshua has talked about, and Michael, I don't know.

52:50

I know Joshua for sure has talked about feeding forward.

52:53

So, like, before engagements, how do we kind of prime people for what we're going to talk about? And Erik, I think we kind of figured that out at the very end with the Achieve 60 brief, which we wrote in layman's terms. But yeah, this idea

53:07

of giving collaborators homework, in a sense: like, this is what we're going to be talking about, here's kind of a cheat sheet for what you're going to see. Because yes, sometimes when people walk into the room, and you've seen some of the pictures,

53:19

and you see all of those screens and all the data, it can be very overwhelming, it can be very intimidating, especially for people who are also just kind of trying to make sense of some of this data. And we're not the experts, but sometimes it looks like

53:32

we are, because we're the ones standing in front of the screen.

53:35

So yeah, this idea of creating a less intimidating, more welcoming environment: yes, I think we could have done that better.

53:58

I think our entire class is waiting for the after-session with you and Chelsea, where we get to talk with you a bit more directly. Are there any other questions?

54:11

Okay, then why don't we wrap up the formal part of the Understanding Complexity series, and thank Margaret for the amazing talk. And so,

54:25

I don't know if you can hear the class over what I'm saying.

54:29

But they did.

54:30

I just, yeah, oh, I just realized I didn't put my email address on anything. I've put my email address in the chat, so if you want to follow up on anything, or have other questions, or want to know about any of the other projects, please feel free to

54:42

email me.

