tv Public Affairs Events CSPAN November 8, 2016 6:00pm-8:01pm EST
artificial intelligence doesn't learn like a human being. it learns differently, and there are different ways to teach computers, but none of them involve putting them in a classroom, giving them a lecture, and then taking them into the field for a few dry runs. we have learned that the old ways we taught humans don't work on computers. the first point i want to stress is that we see a chasm opening between the ability to deploy autonomous systems and the capability of teaching them what the rules are. obviously, that gap will close as computer systems continue to develop, and that is quite possible, but to be fair i think the military hardware is outpacing the ai side currently.
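the contrast the speaker draws -- explicitly coded rules versus statistically learned behavior -- can be sketched in a toy example. everything below (feature names, weights, threshold) is invented purely for illustration:

```python
# toy illustration (hypothetical, not any real system): an explicitly
# coded rule of engagement versus a statistically "learned" one.

def rule_based_decision(target):
    # the rule is written down by a human and can be audited line by line.
    return target["hostile"] and not target["near_civilians"]

def learned_decision(target, weights, threshold=0.5):
    # a learned model only approximates the rule: it produces a score,
    # and the mapping from features to score is opaque to the operator.
    score = sum(weights[name] * target[name] for name in weights)
    return score > threshold

contact = {"hostile": True, "near_civilians": False}
weights = {"hostile": 0.9, "near_civilians": -0.8}  # invented numbers

print(rule_based_decision(contact))        # True
print(learned_decision(contact, weights))  # True, but only because the
# score cleared a threshold; a differently trained model could disagree.
```

the point of the sketch is the chasm the speaker describes: the rule-based version is the rule as written, while the learned version only tracks it statistically, which is why "teaching the rules" to a deployed system is hard.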
that's my first point. my second point is it's going to be relatively easy to field. again, we discussed this in e-mail before the panel. most autonomous systems today are still stationary. why? because movement for autonomous systems is complex. it's difficult. you have obstacle avoidance. you have lots of different types of capability. now, the oldest autonomous weapon system on the planet is the land mine. some people would say it's semiautonomous because of the way it works, but if you want to go into detail, take the acoustic naval mines deployed in the 1950s and 1960s: they had small computers on board, and they would only target enemy vessels with a particular acoustic signature. they were actually, i think, the first really solid autonomous weapon systems in the world. those have been around for a long time. however, they don't actually go around and try to find targets. that adds a level of complexity, which is huge. so the autonomous machine guns connected to land radars, which the south koreans and other countries use, stay in one place. when you go into territory where a machine has to learn the environment -- that is a very complicated machine vision problem: looking at territory it does not know, identifying human from non-human and friend from foe. that is a complicated problem. now, i say that with one final comment. i'm not even 100% sure the rules
we have actually fit robots, and i'll explain why. you see, we built today's rules of combat for humans, and they come with a few hidden assumptions. one, human beings make mistakes, and we're okay with that. we accept a certain level of risk in combat for human soldiers. you're allowed to make a mistake if you're a soldier. it's not a war crime to make a mistake; it's a war crime to do something really nasty. i'll tell you a very sad story. in one of the israeli military operations 14 years ago, the terrorists had fielded one-ton ieds under roads to blow up tanks. the tanks couldn't withstand the blasts and they were blowing up. one israeli tank was traveling in the area, and the crew were really on guard for that threat. suddenly there was this huge boom from the bottom of the tank. they were sure they had gone over an ied. they were scanning from the turret. they looked into the periscope and saw two people running away from the site. they shot at them and managed to hit them. only ten minutes later did they realize it wasn't an ied. the tank had gone over a huge boulder, which hit the bottom of the chassis and sounded like a huge explosion. the two people were innocent. the reality is the crew had killed two innocent people because they were in a combat situation. there was a military court-martial. they weren't found guilty.
are we willing to give computers the benefit of a mistake? now, remember, human beings get self-defense as a defense in criminal proceedings. are we going to give autonomous systems self-defense, or the defense of necessity? our whole system is geared for human beings, so the bottom line is not only is it difficult to train the robots on the rules; i'm not 100% sure the rules are ready for artificial intelligence. >> i'm processing that, because i think you're onto something obviously extremely important in terms of whether the challenge is to develop new rules to deal with ai, or new rules to deal with an even broader category, and what the expectations are.
my sense is we can all learn a lot in terms of how we think about this, and we might think about coming at it from the liability side rather than trying to define autonomy or non-autonomy -- what should be avoided and how do we go about it. i think if we go about it in a case law sense, the potential is enormous. i want to pick up on a number of things that you said. from any one perspective, is the distinction between stationary and mobile an important distinction? if one thinks about prohibitions or what could be avoided, does it matter? and then, relatedly, the defense
of one's territory versus action outside one's territory. do you want to jump in on those two points? >> can i say a couple of things? >> sure. >> i think the feeling was that stupid autonomous systems will be deployed before the actually intelligent ones, and that that is not acceptable. well, that is actually presuming, i feel, that the weapons and the people who are deploying them are doing it in an irresponsible manner. that's a fear which is there, but i don't think that's the way the induction of technology into the armed forces is done.
we have to look at what is inherently wrong with fully autonomous weapon systems. it is not autonomy we're campaigning against; it's fully autonomous weapons systems, and meaningful human control is what's being looked at. and there is a weakness in defining what a fully autonomous weapons system is. weapon systems that are fully autonomous are those which can select and engage targets without human intervention. that "and" is important. what is meant by selection? selecting alone is acceptable. engaging alone is also acceptable. but selecting and engaging together is where the line is being drawn, and the reason behind
that is that between the selecting and the engaging there's a decision point. and one point of view is that that decision to kill should be left to the human, not the machine. i would like to make the point that if we're looking at the various technologies of the kill chain, as we talk about it -- where you first identify, then navigate to the object -- in fact, if we look at the narrative of 2009, it specifically brings this out. in all these functions, autonomy
is permitted. nobody would even object to it. it is only the decision to kill. the point i really want to make is that the complexity of ai does not go into that decision loop; that decision loop is a separate aspect. as long as a human is there, there's no technology involved in bringing the human into the loop. that is one point i wanted to make, because if we're thinking in terms of banning technology, it is trivial as far as the technology is concerned. coming to the question that you asked about defense and offense: actually, any military person would know it's not defense versus offense. there's no difference between --
there's an aspect of mobility coming into it. the autonomous systems which are meant for defending and those going into offense would really be of the same nature. i don't see any difference between the two. >> what about territoriality? you could avoid that distinction by saying you can operate it on your territory, but not outside your territory. >> okay, i'll elaborate a little more on that. i brought out this aspect of conventional warfare vis-a-vis four scenarios. i'll take an example from india. if you have a line of control, there's a sanctity to it and it cannot be crossed. if we're looking at that scenario, then if you try to defend, that defense involves --
there's not much mobility. you could have non-mobile robots doing the defense. when we have gone into a conventional operation, then when you're talking about defense, you're also talking about mobility, so you attack. what i'm saying is, depending on which backdrop you are looking at, defense may or may not involve mobility. that's why i'm saying that, in general, trying to draw a distinction between offense and defense may not be very correct from a technology point of view. however, it would be more acceptable to those who do not want to delegate to the machines. >> i'll ask daniel to comment on
the territoriality thing. i'm thinking of the iron dome, which operates over israeli air space. then there's the wall. i'm thinking of analogies because the general is talking about the line of control, which delineates the part of kashmir that india controls. you can imagine that kind of boundary being a place where one might put autonomous weapons to prevent infiltration that's not supposed to be coming across. on the other hand, presumably -- like last month, when there was movement going back and forth across india's line -- you might want to turn those systems off
so you don't hurt your own people. given israel's experience and your experience there, does the distinction of territoriality matter, practically or legally, or not? >> if we take, for example, the iron dome system: it has been made public that iron dome has three different settings. you have the manual setting, the semiautomatic, and the automatic setting. and it's a missile defense system, right? the idea is you want to shoot down the incoming missile over a safe location, so part of the algorithm is to know where it is going to land. the israeli system works like this: it calculates what the missile is going to hit, because it is on a ballistic trajectory and is not going to deviate from its track. you automatically do a lot of things. then it calculates, if it's going to hit in a dangerous place, where to shoot it down so as to minimize damage. theoretically, boundaries are not relevant for that. if you can catch the missile early enough, we wouldn't care if it landed in another country's territory. the idea being that the system is not supposed to take boundaries into consideration; it's supposed to take human lives into consideration. my gut feeling is that the stationary versus mobile issue is just a technological difference of complexity, and the geography is not a real issue. although, again, following in the general's footsteps, i think people will find it easier to accept that you would field
such things in your own territory than that you would send them into another country. on the moral and public-opinion side, there are arguments to be made that these are additional steps down the road. but from a technological and even from a legal side, i don't really think that there is that distinction. i don't think it holds. >> on the complexity, let me mention again that systems which would target, let us say, mobile targets would be many times more complex. let me just paint a picture. again, i'll take this example from conventional warfare. say you have a battle in an area of 10 kilometers by 10 kilometers.
it is a contested environment where there are no civilians present. now, this has to do with military capability. here are the models. today, this battle would be fought with a hundred tanks contesting amongst each other. the blue forces would be destroying the red tanks, so the two states are on par. now let's say one side has ai technology and has fielded autonomous armed drones instead of tanks. i'm trying to analyze the complexity, as compared to today's technology, of these armed drones picking up those tanks and destroying them. i think the complexity gap is hardly anything for technology that is already there.
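the desert scenario the general describes can be sketched as simple signature matching. all signatures, units, and tolerances below are invented for illustration:

```python
# hypothetical sketch of "picking up tank signatures in a desert":
# compare a sensed contact against a small library of known signatures.

KNOWN_TANK_SIGNATURES = [(9.2, 3.5), (9.8, 3.6)]  # invented (length_m, width_m)

def is_tank(sensed, tolerance=0.5):
    # a contact counts as a tank if it is within tolerance of any
    # known signature on both dimensions.
    return any(
        abs(sensed[0] - sig[0]) <= tolerance and abs(sensed[1] - sig[1]) <= tolerance
        for sig in KNOWN_TANK_SIGNATURES
    )

print(is_tank((9.3, 3.4)))  # True: close to a known signature
print(is_tank((4.1, 1.8)))  # False: a car-sized contact
```

in an uncluttered desert the few contacts are physically distinct, so matching of this kind is plausible; a terrorist hidden in a population presents no external signature to match, which is exactly where the complexity lives.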
drones are already in place. you only have to pick up tank signatures in a desert. so if a country develops that military capability, those lives would be saved. in such a scenario the complexity is not there. the complexity is there where a terrorist is mixed up in a population and you have to distinguish the terrorist -- there's no external distinction at all -- so that's a complex problem. i just wanted to comment on that complexity. >> mary, come in and sort all this out for us. >> thanks. i'm just thinking about the international talks we've been participating in -- three weeks of talks over the last three years. they look for points of common
ground where the governments can agree, because there are about 90 countries participating. at the last meeting they said fully autonomous weapons systems do not exist yet. there was pretty widespread acknowledgment that what we're concerned about, the lethal autonomous weapons systems, are still to come. the other thing the states seem to be able to agree on is that international law applies, international humanitarian law applies, and the testing of your weapons -- doing that through article 36 -- of course applies to all of this, along with some notion of what we are talking about: a weapon system that selects and engages targets without further human intervention. what they haven't been able to do yet is break it down and really get into the nitty-gritty details. i think that's where we need to
spend at least a week talking through the elements or characteristics that concern us. is it mobile rather than stationary? is it targeting personnel rather than materiel? is it defensive or offensive -- although those words are not so helpful for us either. what kind of environment is it operating in? is it complex and complicated like an urban environment, or are we talking about out at sea or out in the desert? what is the time period in which it is operating? it's no coincidence this campaign to stop killer robots was founded by people who worked on the campaign to ban anti-personnel land mines, because we're concerned one of these machines could be programmed to go out and search for its target not just for the next few hours, but for weeks, months, years in
advance. where's the responsibility if you're putting out a device like that? that sums up the kind of time we need in the process to really get our heads around what the most problematic aspects are, because not every aspect is problematic, and that will help us decide where to draw the line and how to move forward. >> if states have agreed that the laws of armed conflict and international law apply, it seems to me that's a different circumstance than if they don't agree. dan is shaking his head. tell me why you're shaking your head, but pick up on this too. >> mary's absolutely right. you know, when i grew up, there was a band called supertramp. >> sure. we're dating ourselves.
probably when you were a teenager. >> one of my favorite songs growing up had the opening lyric "take a look at my girlfriend, she's the only one i've got." we don't have a plan b. as a very old-time international lawyer who deals with this issue, i don't have an alternative set of rules to apply to this situation. so we have no choice but to say at the international convention that we will apply the existing rules. the problem is we don't quite know how to do that. >> right. >> that's one of the problems. the rules don't work as easily on robots as they did on humans, and they don't work on humans as easily as we thought they would. when we're asked to translate that into reality, we'll have a huge new challenge.
that's one of them. >> let me jump right in on this, and we can continue it as a conversation. that seems to me one of the strongest arguments for at least a pause, if not a ban -- a moratorium -- to the extent that what you just said obtains. the argument is: let's wait until we can sort this out. tell me what's wrong with that, if anything, and whether the problem is practical or legal. >> i am also a cynical international lawyer, okay? and the reason i am is because i used to do this for a living. international law is often a tool and not an end. if you look at the list of the countries participating in the process, you will not be surprised that the primary candidates for fielding such weapons are less involved than the countries who are not supposed to be fielding those weapons. in fact, if we take the land
mine issue as a specific example: with the anti-personnel land mine regime, the world is divided into two groups, the countries who joined and those who did not. as a result, it is not a rule of general international law; it is only binding on the member states, which creates a very bad principle -- that international law is different for every single country. this is part of international law, it is how the system works, but it is one of the fallacies of the system. for example, for canada it's unlawful to develop or field an anti-personnel land mine, but for israel it is totally legitimate to do so. if israel and canada were to fight, israel could use them and canada could not, which
shows you how stupid international law can be. i say that to tell you what happens with autonomous weapons systems. i know who is going to field them. the countries that are going to field them are not the countries that are going to be abiding by any results from that process. and the last thing i want to happen involves the normal countries who have very complicated project and approval processes for fielding weapons -- like india, who saw the robotic revolution coming 15 years ago and took this problem on board as one of the issues they needed to tackle. i would trust them much more to handle this issue effectively than a country where i know they don't care as much about collateral damage. so my concern with the proposed ban is that it will achieve the opposite result. the good guys, who will take care
only to field systems after they know they can achieve all the good results we think they can, won't field them until they're ready with a small mistake probability. but the other people will field them earlier, and that is not necessarily a reality i want to live in. so that is where i come in on the discussion. >> mary, how do you respond to that? >> the treaty we're talking about is a geneva-based convention. all the countries that are interested in developing autonomous weapons technology are participating in it, so nothing would be agreed in this body without the consent of all of these countries. we have china, russia, the united states, israel, south korea, and the u.k. in there debating this subject. and just to come back on the land mine side, we do have 162
countries that have banned these weapons. we have gone from 50 countries producing them down to 10 as a result of the international treaty, and the treaty's members include former major users and producers. we're not talking about doing a land mines treaty here on autonomous weapons -- not yet, anyway. we're talking about trying to deal with it within this particular framework. we want this to work, because if we can't do this with everybody around the table, you might end up going to these other efforts. at the moment, at least there's consensus to talk about it; there's not consensus on what to do about it yet. >> what has been the thinking on a moratorium as distinct from a ban? i ask for the following
reasons. if there's the possibility that smart versions of these weapons could be more discriminating and have other positive value from a humanitarian point of view, then an indefinite or permanent ban seems to be a position one might want to question. on the other hand, since people stipulate that they don't quite know how to apply international law and other things here, the argument for a moratorium until that's worked out would, on its face, make sense to people, i think -- which is how i try to think about these things. take me through moratorium versus ban. i know you're working on a ban, so i'm not asking you to endorse something you're not working on. >> the moratorium call came from a u.n. special rapporteur.
heyns issued a report in 2013 in which one of his major findings was that there should be a moratorium until the international rules of the road were figured out. later, on his way out, he issued further reports calling for a ban. that was his initial position, and then he moved toward a permanent ban. we haven't talked about a lot of the other concerns raised about these weapons systems, but the moral concern -- that you're ceding the responsibility for taking a human life -- is something we take issue with. we're already seeing the effects of weapons with some degree of autonomy in them, and states don't want to cross that moral line. there are a lot of countries talking about security and stability and what happens when
one side has these weapon systems and the other doesn't. what does it do to the nature of conflict and war fighting when one side has all the nice high-tech capabilities and can use them and the other side cannot? the question here is: are we going to level the playing field so everybody has these weapon systems, or is it better that nobody have them? because at the moment there's still time to sort this out. there's still time to put out some rules, and there's still time to prevent fully autonomous weapons systems from coming into place. >> i think you used the terminology fully autonomous weapons system. if we put a moratorium only on the fully autonomous weapon systems, which again implies keeping a human in the decision to kill,
then we are not putting a moratorium -- i mean, this proposal is not trying to put a moratorium on using ai and autonomy in all the other functions of the kill chain. essentially, the decision to kill does not require ai. it is just an implementation problem, how that system works on the ground. in effect, nothing will happen on the ground, because all the underlying technologies will get developed. the last part of what you said was on -- >> in terms of who has these weapon systems, the haves and
the have-nots. >> that rationale -- the whole idea of developing this technology to have military capability, to have predominance over adversaries -- that logic cannot be applied to a particular type of system per se. the idea of this new technology, other than having a technological edge over adversaries, as i brought out, is that bringing ai into weapons systems is going to lead to a cleaner form of warfare. even vis-a-vis the standard bombs being dropped from aircraft, ai is better.
more intelligence means more discrimination. even if you don't have aspects of empathy and judgment until a much later stage, it will lead to more precise targeting of what you want to target. to that extent, on the one hand you're building military capability; on the other hand, you're leading to a cleaner form of warfare. i would say just declaring a moratorium is not going to lead to anything positive on the ground. if ultimately the conventions decide from other points of view -- if you look into the future, at this aspect of ai taking over the human race, et cetera -- if you're looking at it from that perspective and decide on banning the technology at this stage, that
is worth considering as a point, but not from the issue of the decision to kill. i'm really speaking how i feel. >> dan, i want to open it to the broader discussion here. >> i think the point i want to make is there are several different agendas, all legitimate, at work here. one school of thought says we're not ready to field such fully autonomous systems yet. i think they are currently right. i think we haven't solved the technological requirements to ensure the statistical accuracy of our systems in a complex environment -- not in the simple one you described, general. i haven't heard of anyone who has solved the ai problem of doing that. it requires so many strands of technology. it requires target identification -- remember, this is in a combat situation.
it would need accurate target identification in complicated environments. you need a machine to be able to do that under a lot of stress, physical stress. there are lots of challenges, which i call technical, but they're really intricate technical difficulties. they will be solved -- i'm a million percent confident they will be solved -- but just not today. so one group is saying: wait before you allow a machine to push a button that kills a human being. that is one school of thought. another group says something broader: we don't want machines to kill people, period, irrespective of how good they are at doing it. we don't think this should take place. now, this is a moral, philosophical, important discussion on a totally different level, which has nothing to do with the technology involved. i will point out here that we have already undergone a partial robotic revolution in the
civilian sphere. they've become invisible already. but if i go back in time, one of my favorite stories is that the first elevators in the world were built in chicago, where they had the first high-rises. like you saw in the old movies, there were elevator operators who used to operate the elevator. what happened is they built a high-rise which was too tall for human operators; it had to move too far. so they built the first-ever machine-operated elevator in chicago. the problem was that when people walked into the elevator and didn't find the operator, they thought it wasn't working. so they put up a sign -- i have a copy of that sign -- explaining this is the first-ever machine-operated elevator and it's perfectly safe to use. no one would use that elevator at the beginning, because they thought it was unsafe. how could a machine know where to stop? an elevator is a very primitive form -- today, especially with
the quite complicated software you put in them -- of an autonomous machine that can kill you. traffic lights are autonomous systems; if they make a mistake, people can die. we have long accepted the fact that computers can make decisions for us which can kill us. what has happened for the first time is we have reached the stage where we're thinking about machines doing it on purpose. this is a decision point, and we need to decide if we're crossing it or not. being the cynic i am, i think we have already crossed it. i'm happy we're having the discussion now and not 20 years from now. and the final school of thought -- i think the general voiced it perfectly -- is the question of whether we want cleaner wars. there are two schools of thought on that one. one says the more accurate
missile systems, the better -- coming from israel, remember, we are advocates of accurate missile systems, because the fewer civilians we hit, the less israel is criticized for doing something wrong. we have a vested interest in using more accurate systems. there is a legitimate counter-argument saying part of the reason there are not so many wars is because war is dirty and civilians die, et cetera. if you just kill the combatants, you'll be happier to go to war. i'm not saying i agree with that position, but i'm showing you the different schools of thought which are converging around this issue. each one is a separate discussion, and you need to choose which one you want to focus on at a given moment. >> that was a great summation and taxonomy of the discussion.
let's open it to discussion. you all know the procedure. i call on you and then you say who you are. somebody will bring you a microphone. there's a lady here about midway, and then the gentleman you're walking right by. ladies first, at least for the next eight days or until january 20th. >> i'm from the center for naval analyses. i was wondering how you think this discussion applies to cyber warfare, particularly in scenarios where cyber weapons could be lethal. >> actually, the cyber domain is very much part of this discussion of autonomous systems -- how autonomy should come into play as far as warfare is concerned.
the current heated debate is about human lives -- taking human lives -- and cyber, while in a sense it can affect human lives, does so indirectly. so if you're talking about a cyber attack, an autonomous response that kills the incoming attack is very much part of autonomy playing a role in warfare in the cyber domain. but there's no objection to that. to that extent, i think that field is getting developed and will progress without the legal and ethical issues involved here. that's what i would say. >> dan, you want to jump in on that? >> yeah, i actually think it's part of the discussion. one of the reasons i say so is because i don't actually know where cyber starts and kinetic
begins anymore. i used to know. i don't know anymore. one of the discussions we've been having on fielding robotic systems, for example, is what type of protection you want to give those systems against being hacked. one of the ideas that came up in a discussion a few years ago was that maybe we need to create a kill switch with which you can turn off a malfunctioning system -- we call them w.a.r.s., weaponized autonomous robot systems. i agree cyber attacks are generally not focused on killing human beings, but indirectly they can do tons of damage. there's a subject of cyber
autonomy which is scarier than anything we've discussed so far: in the cyber world there is the possibility of self-replication. we don't know how to create an autonomous fighting vehicle which will create a copy of itself and go out onto the battlefield; however, we already know how to do that with computer viruses. i actually think the cyber autonomy world is even scarier, because it has the potential of us losing control, more than the kinetic side. but that's another issue for discussion. >> it's halloween, so scary is okay. the gentleman right there, burt, and then we'll go back. >> i'm a professor at george mason university. i want to pick up on a point that dan raised about the varying levels of autonomy that we have in technology currently
and that we're almost on the cusp of different types of autonomous systems that can take lives. i want to point to one that already does: self-driving cars. >> yeah. >> right? they make moral decisions to kill. they're going to crash as a matter of the laws of physics or statistical probability, and they need to be programmed to make a decision that is a life-and-death decision. i would like to hear a little bit about some of the distinctions the panelists see between this type of technology and lethal autonomous weapons. >> i've done some work on that, and you mentioned that in your opening comments. the short answer, of course, is that no one has a good answer as to what we're supposed to do with an autonomous car, right? being a procedural lawyer, the question then becomes not what do we do, but who is responsible for doing it. so we now have a discussion which goes something like this.
option number one -- this was in a discussion two weeks ago, by the way, with some of the companies that do this -- is you allow the person who buys the car to make the decision when he buys the car. when you choose the color of the car: by the way, would you rather commit suicide if the car encounters the following situation, or would you prefer not to, sir? one of the people in the meeting said, let's agree we give different colors to those cars so you know who they are on the road. it's a true discussion. option number two is to say, no, the car comes hard-wired with a decision. do we tell the people who buy the car what it is? you can't, because it's an algorithm. it is way too complicated to
explain. of course, the car won't automatically kill you. it will go through a process of decision-making and make its best effort to do whatever the person who wrote the code told it to do. there's no way we can summarize that in a way the customer will understand. i'm taking you through this because when we try to move the analogy to the warfare side, the main difference is that on the warfare side this is all intentional. but the reality on the warfare side is the distinction part: when you have different people on the battlefield and you want to identify who is a combatant and who is a non-combatant, you need to find a way to optimize what you're going to do so that you hit only the former. it is actually exactly the same question if you take away all the fluff.
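that shared optimization question can be made concrete in a toy sketch. everything here -- action names, risk numbers, thresholds -- is invented; the only point is that some actor must fix an "acceptable harm" parameter before the system acts:

```python
# hypothetical sketch: pick the least-harmful permitted action, given a
# threshold that either a commander or a manufacturer has fixed in advance.

def choose_action(actions, acceptable_civilian_risk):
    permitted = [a for a in actions if a["civilian_risk"] <= acceptable_civilian_risk]
    if not permitted:
        # nothing clears the threshold: default to not acting at all.
        return {"name": "hold fire", "civilian_risk": 0.0}
    return min(permitted, key=lambda a: a["civilian_risk"])

actions = [
    {"name": "strike now", "civilian_risk": 0.30},
    {"name": "wait and strike later", "civilian_risk": 0.05},
]

# option one: the commander sets the threshold per mission.
print(choose_action(actions, acceptable_civilian_risk=0.10)["name"])  # wait and strike later

# option two: the manufacturer hard-wires it, unseen by the operator.
HARDWIRED_THRESHOLD = 0.01
print(choose_action(actions, HARDWIRED_THRESHOLD)["name"])  # hold fire
```

the selection logic is identical in the car and the weapon; what changes is who chose `acceptable_civilian_risk`, which is exactly the liability question the panel raises.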
then the questions arise of who is going to make the decision. are you going to ask the commander to tell you in advance what level of civilian casualties is acceptable, which is option one. this is easier for me to say, for example, because that's how military operations work today. or are you going to allow the manufacturer of the autonomous weapon system to hard wire that into the system, and then, if i went back into my military career and i'm back being a colonel in the israeli army, i have no idea what it is going to do when i press the button. i have no way of controlling that. so the questions are exactly the same, although the scenario is different and i think you're right. i think we're facing the same dilemmas now on the civilian front as we're going to face on the military front in the
future. >> i think one of the things which is happening in these discussions is that we are talking about autonomous systems in general. there are grades of autonomous systems to be used in different contexts. in today's context that may appear to be simplistic, but in yesterday's context picking out tanks with an autonomous vehicle was not a simplistic affair. a simplistic situation is you tell your autonomous systems to take out all the enemy airfields. you said that was an easy system. the next more complicated one is what i painted as tanks. when you come closer, i can
paint another situation where a company is going in for an attack and there are bunkers. that's a more difficult, closer situation where it could now end up in a close quarter battle where aspects like empathy would come in. this is a more complicated situation. the broad point which i'm making is that what we are banning or what we are deploying -- let's not talk about banning -- has to be in a graded fashion. whatever technology level is reached, to that extent autonomous systems should be permitted to be deployed in a responsible manner. already, there are autonomous systems on the ground.
they've been there for decades. you painted mines as the most primitive autonomous system. there's a convention against mines for similar reasons. let's not talk about mines. they're already there. as you perfect technology in a responsible manner, they should be deployed. rather than talking in general, the moral aspects of the question that were raised just now will come at a much later stage of technology. if autonomous systems can mimic empathy and judgment, that will be much later. if it is perfected to that extent -- that brings me to a second point. that is about who is accountable. the point of accountability was raised. is it the developer, the manufacturer, the commander, the state? different levels of accountability. i wouldn't say that if an
autonomous system malfunctions on the ground the commander and the state -- the state in any case cannot absolve itself of responsibility. it is responsible in every case. but even then he is responsible only within the rules of the system, within that bounded activity the system is supposed to be deployed in. if it malfunctions, the evaluation aspect has to be very strong, more importantly in more complex scenarios and systems. >> i suspect, as with the vehicles, if you move in this direction with military systems, that latter point will be more debatable. in other words, do we want cleaner wars is the question? do we want fewer traffic fatalities? the answer may be yeah, but i would rather -- it's easier to
be in a system where the driver and soldier are accountable than one where, even if it's safer and cleaner, a big supplier is accountable or the state is accountable. it's interesting the issues this is going to bring up, including financial reasons: i would rather not take on the liability, i would rather you have the liability. brave new world. this gentleman here in the middle. yep, then why don't we take two. this lady with the blue and white striped shirt here. if we can get another microphone to her. let's take two in the interest of time; we are bumping up against it. >> hi. keeping in the theme of things that are scary, we touched on autonomous deterrents. that could be where you put in an input of what to do in a
second strike nuclear attack, or in the realm of cyber, you launch an attack before the systems go offline completely. the question is, how do you integrate the questions about deterrence and autonomous weapons systems and the effect where you don't tell your enemy you have these capabilities for operational security reasons, but make it likely things will get out of control. >> let's take -- while you are thinking about that, let's take the other question, then parallel process. >> my name is lauren green, i was a holistic essay assessor and scored the test of english as a foreign language for 6 1/2 years until an algorithm replaced the human reader. i'm becoming a journalist. my question is, are you not critically aware that artificial intelligence to make
computers think as humans is destroying our reasoning process because we are granting these machines so much importance as to cancel our own reasoning out of the process? >> wow. i didn't do that well on my s.a.t.s. i'm not sure i understand the question. i'm trying to process it. >> trying to create a system where a robot serves as a person deciding when to strike, where to strike and how to strike. the process, maybe a computer system to reason when it would be appropriate to strike. so, we are granting this algorithm we are creating more weight than our own thinking and spontaneous thinking. it's cancelling our own ability to think spontaneously and reasonably, even as demonstrated today with the explanations that you have provided and maybe lacking a real critical target
in your arguments. there was a lot of just open processing without really making a definitive, in some cases, answer. also, the process for deterring autonomous weaponry is entirely too slow. i think most people are critically aware, there's a lot of apathy against altogether cancelling out the prospect of fully autonomous weaponry. i'm wondering if that's because so much money is invested in the artificial intelligence process and not enough in human capacity. >> okay. >> the first part of the question was about you are
delegating, you are saying a machine can be more reasonable, make more reasonable decisions and be safer and make the correct decision better than a human? >> i think the opposite. >> she's questioning that. >> she's asserting, not questioning it. she's saying what it is. that it's not going to work. we are destroying our own capacity to reason and think by pursuing it. >> okay, is she saying a machine will never develop to a state where it can do better than humans? isn't that what she is asserting? >> yes. >> that's for the scientists to, you know, say. >> why would we ever want that? >> it's not holistic technology.
it's whether it will be able to understand natural language. you will see what is happening today. so, on the aspect of reasoning, my own belief, with a layman's knowledge of ai, is that anything the human mind can do, including aspects like empathy and mimicking judgment at any level, it is not far off to develop to that stage. there is no scientific reason to believe otherwise. >> it is only mimicking judgment, it's not rationally judging. >> dan, you want to jump in on this? >> i want to talk about the two questions together. it's all a question of delegation. you used that word in your introduction. you're questioning whether it's right to delegate some forms of decisions to machines, with the assumption that it's a bad
idea. i do not agree with you on every scenario, but i think it's a good question. you went a step further: should we delegate authority to use a significant amount of power in a disastrous situation where human beings may not be able to respond quickly enough, effectively enough or intelligently enough to counter attack or whatever. these are great questions because they raise the question of what are we developing ai for? okay? now, it started off, if we forget the first few years when it was a scientific experiment, it's supposed to be something that makes our lives better and easier. it's the idea behind this subculture. so, for example, if it can make a good decision quicker than a human being and save a life, most people would say that's a
good thing. and, as we are seeing technology develop, i, personally, being a technological layman working in this field, can tell you i have seen numerous examples where computers are better than humans at making a decision. human beings are scared, they are tired, they don't have all the information and sometimes act on what we call instinct, which turns out to be a subconscious thinking program which is sometimes good and sometimes really, really bad. it may not always be a good thing to delegate authority to a machine. i think the decision we need to make is where we agree that the machines come to help. your scenario, an extreme scenario, i would rather not let the machine make that decision. but, i can definitely identify
parts of life where i want machines to help me out. i like the fact -- but i do not want them to replace us in the things which i care about. this is the type of discussion i think we should have now, before we let technology companies and market pressure push us in a direction we are not necessarily willing to go. >> if no one else -- go ahead. >> just to say, we have heard quite a bit from the artificial intelligence community about how it can be beneficial, this is the big catch phrase, and they are investing money to determine ways it could be beneficial to humanity, but delegating the authority to a machine on the battlefield without the human control and the line they draw -- we haven't talked about policing or border control.
we are talking armed conflict at the moment. this is not just in the realm of armed conflict. it's much broader than that. the point at which it is weaponized is a broader, bigger debate. we don't have all the answers to much of it. >> well, i want to thank, obviously, the panelists but all of you for at least here beginning the process of this debate and helping us really hone in on what some of the key questions and issues are. thank you all again and thanks. [ applause ] >> thank you all for coming to this part of the carnegie event.
in the meantime, i encourage you to download the carnegie app with content of the latest analysis. last but not least, join me in thanking the team that helped with this event, lauren and rachel who helped with the organization. thank you very much. tonight on american history tv, victory and concession speeches. beginning at 8:00, the 1980 election, president jimmy carter's concession speech.
9:30, george w. bush's and ross perot's. also programs on presidential leadership and the 1789 debate over the title for george washington and subsequent leaders in the u.s. all of this on c-span 3. election night, tonight on c-span. watch the results and be part of a national conversation about the outcome. be on location at the hillary clinton and donald trump election night headquarters and watch victory and concession speeches in key senate and house races. starting live at 8:00 p.m. eastern and throughout wednesday. watch live tonight on c-span, on demand at c-span.org or using the c-span radio app. we'll be simulcasting the live coverage of elections and
how they are reporting on developments. that will be at 8:00 p.m. eastern. as the nation elects a new president on tuesday, will america have its first foreign born first lady since adams or have a former president as first gentleman? learn more about the influence of america's presidential spouses from c-span's first ladies. now available in paperback. it gives readers a look at the personal life and impact of every first lady. it is a companion to c-span's biography series. each chapter offers brief biographies of 45 presidential spouses and archival photos from their lives. first ladies in paperback, published by public affairs.
fbi officials join activists and others to talk about gun violence, campus safety and improving relations between law enforcement and students. some of the nation's black colleges and universities hosted this event. ladies and gentlemen, please welcome to the stage our education and justice panel, bridging the gap between law enforcement and the hbcu community. please welcome our panelists, calvin, special adviser for campus public safety, department of justice, on detail to the fbi. [ applause ] please welcome dr. nancy rodriguez, director of the national institute of justice.
>> please welcome curtis johnson, president of the hbcu law enforcement executives and administrators. [ applause ] >> please welcome catherine lhamon, assistant secretary for civil rights, u.s. department of education. [ applause ] >> and please welcome our facilitator, dr. michael sorrell, president of paul quinn college. [ applause ] >> i think our panelists can have a seat. i was tempted to have them do this whole thing standing up. that seemed a little harsh. we have a tradition at paul quinn college where i greet everyone by saying good morning quinnites.
we are going to adopt -- i realize not all of you are quinnites. how about this, good morning, family. >> good morning. >> all right. let's do better. i didn't hear the people in the back. good morning, family. >> good morning. >> was the attorney general amazing? [ applause ] >> i hadn't had the privilege of listening to her in person before. i knew she was a sister of incredible abilities, but as she sat and went through her remarks, i realized she took every single point i wanted to make and said it ten times better than i could have said it. so i'm going to tip my hat and just, you know, be amazed at her eloquence and her passion because i think that's what we need in difficult times like
these. so, i am particularly proud to be here this morning because i am part of a group of hbcu presidents that are working on the issue of gun violence in our communities. we wrote a letter to america about our views on what was going on and about addressing the pain that our students felt because the reality of it is, the folks that are being traumatized, they are our students. and if they are not our students, our students know these men and women. and if our students don't know these men and women, some of them know they come from the same communities. they have parents. they have brothers and sisters and uncles and cousins who have been in prison. we live with the reality of the prison system in our schools every day.
and to ignore that that is the case would just be unrealistic. we are thrilled to have this discussion because this is a timely discussion. as the attorney general said, this is the issue of our day. and i am proud of our students. in fact, are the hbcu all-stars here or are they still at the white house? they are on their way? well, please, listen, when they get here, my hbcu all-stars, please make sure you let her know that we tried to recognize them because i don't want them protesting me. okay? now, i'm going to stop our chitchat and get on with the business of this morning. our first speaker is catherine lhamon. i'm going to allow the speakers to introduce themselves. so, if you would do so, please just, when you get done, pass
the mike along or activate the mike along. we will start with miss lhamon. >> can you hear me now? it's really such a pleasure for me to get to be with all of you and with my friends and colleagues on this panel. i am catherine lhamon, assistant secretary for civil rights at the department of education. that means i enforce federal civil rights laws in schools and institutions of higher education and in our p-12 system. the three major areas we focus on are race, sex and disability. we are actively engaged in the work of making sure that all of our students live the promise that the attorney general talked about: actually experience in school respect for their person, opportunity to learn and the -- the ability and the platform to realize their dreams. so, it is such a pleasure to be
with all of you who are doing that work every day who are in partnership with us to ensure our students have that opportunity and i'm looking forward to this conversation. >> thank you. curtis? >> good morning, everybody. curtis johnson, current president for hbcla, and i represent the chiefs and security directors of our institutions nationwide. it is a great honor and privilege for me to sit before this body as we have done some good work over the last couple years. i'm thrilled with my colleagues that i harass on a regular basis on behalf of students to have this discussion. i think we started the ball rolling a little bit in a couple areas that the attorney general talked about briefly. we brought 20 chiefs and 20 students to howard university in august to have a conversation about bridging the gap between law enforcement and campus
communities. it was a very spirited conversation. we wanted to make sure it was an honest conversation. we started the day before with a team building process to be able to allow everyone to speak freely. we chose specifically not to include the press to make sure folks could talk from the heart. we wanted to get, i believe, to the root cause of the issues at hand. the white paper was completed and released yesterday. i forwarded it to the chiefs, first, to have an opportunity to take a look at it and to the students so they would have the opportunity to look at it. it is available as we speak on the national center for campus public safety website so everyone in the nation can have an opportunity to see it. where do we go from doing that piece of it? we are pushing that agenda forward. we wanted to be the organization to have a conversation and lead the nation to put it out and
heal our communities throughout the country. we are taking a few steps to get us on the right path. i wanted the hbcu community to be the catalyst to get it started. very happy to be here and have this conversation today. i pass it to miss nancy. >> good morning, everyone. i'm nancy rodriguez, the director of the science agency within the department of justice. so wonderful to get a chance to hear my boss, the attorney general of the united states. as director of the science agency, we are responsible for supporting and investing in high quality, rigorous research that really is geared to prevent crime and really advance our criminal justice system. so, we support research on an array of important issues, all the way from violence prevention, school safety, human trafficking, radicalization to violent extremism, forensic
science, drugs in crime, gun violence and every component within the criminal justice system, including, of course, policing. we are certainly very committed to broadening the reach of science. since my arrival at nij, we created different mechanisms to support young scholars, to support early career investigators as well as graduate students who i have to say, are, i think, more motivated, more skilled than ever before. i think getting them exposed to the important work that we, as scientists, can do to advance our criminal justice system is so important. we support scholars in an array of disciplines from biology, chemistry, engineering, sociology, psychology and i hope that our discussion is one that
not only exposes you to the various opportunities for faculty and students, but also one that has me walking away with how to strengthen, really, our relationships with minority serving institutions, because i see you as partners in this effort. i can certainly and hope to be able to talk to you about what we are doing to support research in the area of policing to strengthen the relationship with communities and also what we are doing around school safety. we have been very fortunate that congress has authorized us a significant amount of resources to invest in addressing school violence and school safety. thank you. >> good morning. again, my name is calvin. i'm the fbi special adviser for campus public safety. it's a great day. i'm glad everyone was able to make it here today.
let me say this one point. i have been at the department of justice for 18 years. this morning, the attorney general of the united states said my name for the first time in 18 years. [ applause ] i immediately texted my wife. as the special adviser for campus public safety, my main job on a regular basis, a daily basis, for all campuses throughout the country is to level the playing field. the playing field for hbcus exists on a lot of levels in campus public safety. most of your campuses have sworn personnel, nonsworn personnel, contract personnel and also seek help from municipal agencies that surround you. what really happens most of the time, though, is a lot of information is lacking and a lot of partnerships are not present. what i do on a regular basis is promote partnerships between campuses and local law
enforcement and throughout campus law enforcement throughout the country and get them to work with 56 field offices around the country, on two different things, mainly. we try to get them to work on building partnerships to take advantage of the fbi's reactive resources: 56 field offices, thousands of agents. we have a lot of resources to be able to come to a campus when an issue or challenge happens there, whether it's a chemical spill or active shooter situation. the other part that we have is what i call proactive resources, which is we have training where we come to your campus and talk to your campus personnel about things such as not only active shooter training, but cyber threats, chemicals on campuses and things like that. i'm open to talking about all those different things as we go throughout this conversation here. i would be remiss if i didn't mention the threats on our campuses.
most of your campuses are really blessed to be in the communities that you are in. as the attorney general talks about, hbcus and communities, you know your university, most likely, was built out of the community that is around it. because of that, you exist in that community and you have a lot of violence and other things that exist outside your walls. they may not fall into your definitions, but they exist and students become victims of them. we work on behalf of students to get that kind of information out to you. our biggest -- your biggest threat on the campus these days is the threat of violent extremism: the people in this world and this land that want to take advantage of the attitude and tone going on in this country now and the students caught up in black lives matter and other issues going on. that also leaves their minds open and susceptible to people who come in and just bring up other things to them that may not be conducive to a learning environment.
the other thing is, cyber threats are the next biggest thing that most campuses are victims of. at private universities and state universities, there are those who see the open environment as a way to move into larger computer systems through your system, if you don't have an active firewall and those sorts of things. they are using those systems to enter into state systems and federal systems. we are open to talking about those things, too. we have resources in our field offices that work through all those things. thank you. >> all right. well, we are now going to turn to our first question and catherine, it's for you. we know that students, regardless of their race, their origin, color, gender, all deserve a safe environment for their education. maybe if you can talk a little
about the work you are doing around campus climate and how that factors into what we are seeing at hbcus. >> great, i would like to talk about stories in each major area. i will spend the most time on sexual violence because that's been a topic that is very much in the news and i think it's very important for us to make sure we are actually living our civil rights promise on campuses on that topic. i do want to touch on race discrimination and disability discrimination because those issues are less in the news but prevalent in the schools. i'll start with a story outside the hbcu context: a resolution agreement with an alabama college that we entered into two years ago. i say that to emphasize how recent the harm to the student is. a black student who is an athlete happened to be sitting at the back of the bus and they were planning to go to an event that had been canceled. she asked her coach what they
would do next. the coach told her to keep her black rear end, that was not his word, but that's what i will use here. keep her black rear end on the back of the bus, rosa parks. and that's not all that she experienced at her school. she had her peers say she shouldn't drive at night because she couldn't be seen. ask her to smile so she could be seen. call her that black girl. over and over and over. a hostile environment that her leaders on her campus perpetuated, her peers inflicted on her and her campus didn't take steps to address for her. they now have, happily. they have paid for counseling for her. they are subject to ongoing federal oversight. they have changed their practices. this is a young woman who made it to college. this is a young woman who is doing the things we tell our kids they need to do and when she got there, she was made to feel unwelcome, made to feel like she can't succeed.
that's unacceptable in our society, period. another story, from an hbcu, sorry to say. a young man with cerebral palsy who applied to college and was accepted. he came with his social worker to see what he would need. the school saw he had cerebral palsy and revoked his admission. thank you for your shock. and revoked his admission. they revoked his admission. they reported to us. we thought you must be mistaken. when we called the college, they said we usually take a look at the individualized education plan they have in high school. if we think we can't support the student, we don't admit the student. wow! thank you. so unlawful. so, that young man has been admitted to the college, he's doing fine, he's thriving. they have agreed to submit to my office for three years every student they reject so we can evaluate whether they should have been rejected or should be
admitted to the school. they have dramatically changed their practice. this is a place that should be welcoming to all students and prepared to nurture and support them and it profoundly missed our civil rights protections for students. then i want to turn to sexual violence. that has been, as i mentioned, in the news a lot. i need you to know that the facts that we see in our investigations are truly appalling. and they -- they range across a whole variety of the kinds of ways sexual violence can touch our young people's lives. because we are talking here about the relationship to law enforcement, i want to tell you about one particular agreement with a campus in the university of maryland system from this summer, in which a young woman reported she had been raped by a campus security officer in a campus security vehicle. the school didn't investigate.
they sent the issue to the criminal justice system, which is appropriate. that officer entered a plea agreement that required that officer to no longer be a campus security officer, which is good news. there was sufficient evidence his behavior was not an outlier. his plea agreement made him report on other incidents from other officers. the title ix coordinator did not investigate at that school. she didn't think there was evidence any other student was unsafe. she received the report from the county police investigating another officer. she never opened that report. she did not look at it. that tells you that we need to change our practices. i am very grateful the criminal justice system operated as it should have for the officer. i am deeply distressed the students at that school didn't receive the support they needed from their school to make sure they would be safe and that no other student would suffer what the young woman who reported had suffered at the
school. i say that to you to say, please, make sure you operate a campus that communicates to your students that every student is valued, that you expect every student you admit to succeed and that you will be there to make sure all your students can enjoy the educational opportunity that our nation's laws promise to them. i'll stop there. [ applause ] >> catherine, i want to follow up on something you said. one of the things that we experience is our students come to us the products of dysfunctional environments. one of the things we discovered on our campus was a tremendous amount of undiagnosed mental illness. right now, don't act like we don't all have undiagnosed mental illness on our campuses. the reality is, we don't talk about it.
it's the dirty little secret in the back room. uncle so and so is just a little off. he's not off. right? there are issues there. how would you recommend that the institutions begin to address the fact that the students are coming to them the products of dysfunctional behavior that is asocial but was normalized in their living context? you literally have to teach them a new way of operating. are there any resources that the department offers or any suggestions that you might have? because i have spoken to enough of my colleagues to know this isn't just an isolated incident. >> yeah, i think there are a couple levels to that question. how do we serve the student himself or herself who has the undiagnosed mental illness and needs to be supported? there, our legal requirements
are, if you have reason to know that a student needs accommodation on your campus, you need to evaluate and provide it. so, where there is a student who seems a little off or who is indicating a need for help, you need to be sure that that student has access to information about how to ask for it, who to go to to seek the help, and where your administrators and faculty have reason to know, that they are able to reach out and offer resources to that student as well. so, there is the how do we serve the student who is effectively asking for help. then there is, what do we do on campuses to assimilate the students who come to us from their home lives, from the k-12 experiences that precede their time on campus? the reality is that racially hostile environments don't begin at 18. sexually violent environments don't begin at 18.
these are kinds of experiences that students live before coming and sometimes experience once they get there. we need to make sure that we are communicating to our students at day one, before they come and every day when they are at school, the environment that we want them to thrive in at the school. so, we encourage a statement of values, we encourage active communication about who you are and what's acceptable on your campuses and active encouragement of sharing differing viewpoints so people can express their ideas, share thoughts and learn on the campus how to interact with others in a respectful way, in a way that is appropriate on that campus. we, in the department of education, recently released a tool kit and a set of guidance for k-12 schools about sexual violence and about ways to ensure our students are learning, before they get to college, appropriate ways to interact with each other and appropriate ways to be a good bystander and stand up for
students who need it and ways for schools to focus on trauma informed learning to respond to the whole person our students are as they get there. we are trying to address the issue before your students come to you and we also strongly encourage you to recognize you will have an influx of new students every year. you will have a changed campus climate every year. you need to be, every year, throughout the year, responsive to who you have on your campuses and how to make sure those students can succeed. >> thank you. i would like to add that it would be helpful from the department's perspective if there were some resources that could help the institutions engage in more preventive measures or in-depth opportunities for training for the students on the way in that would allow us to be more successful. we appreciate all the support you guys give in that area. all right. we'll let you off the hot seat now.
curtis? >> yes, sir. >> you are the president of an hbcu. you have a tough job, a very, very tough job. i know that you are working across the institutions to do this, but maybe you can share with us some of the trends that you are seeing in terms of community policing externally and community policing internally that really help to alleviate some of the issues that we have seen outside of our campuses. >> fair question. so, over the last couple years -- i'll start with externally first. >> thank you. >> community relations is a huge deal for us at this particular juncture. if we go back a few years, i would say that community issues were not at the forefront of the conversations we would have; marijuana was. as the laws across the country
changed, you have students who come to campuses from states where it was legal, and they find themselves in a trick bag. they are thinking, what i did at home, i can do here. subsequently, that ends with them being charged with misdemeanors. if i have a student spending $160,000 on a four-year degree who goes to get a job, they can't because they have a misdemeanor on their record. a lot of chiefs around the country, and where i work, arkansas baptist college, have a program in place where we work with the city and local judges and district judges to seal and clear records once students have paid their debt to society. then, two, we need these folks to be employable and able to get jobs. that's part of an initiative from a community standpoint as far as outreach. the thing you are seeing, before we started to talk about relationships in communities, is these guns on campus and gun violence on campus. i get a call every time we have an incident -- thank you very much for the phone calls -- curtis, what happened on this particular campus? i can tell you over the last five years or so, where you once would see incidents where students had b.b. guns on campus or something of that nature, and where we may have found one real gun, i will now find ten guns a year. i'm not only seeing guns, i'm seeing sawed-off shotguns and ak-47s, and on some campuses the campus safety personnel are unarmed security officers. where you have an unarmed security officer on a campus where ak-47s are being used, they are at an extreme disadvantage and that causes a problem. the risk factor goes up.
so, the deal now is, how do we mitigate those issues not only from an internal perspective, but an external perspective? we took a forward approach, if you will, approximately two years ago. i reached out to kathryn and her team. i asked her about issues pertaining to sexual assault. it's a mixed bag when we look at a student who may have been accused of sexual assault. they are going to be adjudicated on your campus. they are going to go through a law enforcement process as well. then you have a title ix investigation that is going to occur. the thing on the backside of it is that all three of those investigations can be discoverable and brought to trial to use against a perpetrator. now, how does due process work into that process, especially if it's a false accusation?
now you have a slanderous situation for somebody that may not have done it. we have those cases, and, god forbid, we are very sensitive to the issues where we have a sexual assault and we have to move forward. but what happens when a person who was not sexually assaulted screams rape? how does that affect a student and his or her career moving forward? we have it with different genders. we have to be cautious with that. two years ago, we started with a focus group discussion in atlanta with the national center for campus public safety that centered around bringing campus safety chiefs to the table to start having these conversations. from that particular movement, along with some of the other things from a community policing standpoint, we moved on to doing a conference in vermont in 2015 to discuss the issues we are dealing with in community policing.
those are some of the things we have to get better at dealing with. the campuses have to get better from the perspective of, what do we need to ensure our campuses are safe? now, let's be real here. on most campuses, when you talk about police departments and security departments, we are a cost center. we are not a profit-generating area. but i can guarantee you, the first time you have a shooting on your campus and your enrollment dips 200 students within the first 48 hours, you understand we need to invest in our campus safety process. you have to field those phone calls from parents, and i can tell you, on september 27, 2012, i lost someone to black-on-black violence. i had a young man walk up with a glock handgun with an extended clip and shoot another young man three times. now, responding to that, i'm looking at my campus thinking,
we look like virginia tech. i have emergency responders. i'm one of the first people on the scene. you have to forgive us -- most of your chiefs, when we get that information, we don't care how we are dressed when we get there. i had sweatpants on, flip-flops, a bulletproof vest and my gun, trying to get to my student who is bleeding out on the sidewalk. i'm thinking, one, i have to secure my campus so everyone is safe. two, how am i going to pick up the phone and call his mom and dad? if you have never had to pick up a phone and call a parent to basically tell them you lost their child while they were in your care, god bless you. for those of you who have had that experience, you understand where i'm at with that. of the 109 schools that make up the hbcu family, i see approximately 60% of our campuses on an annual basis; i don't see the other 40%.
the only way we got better was that in 1999, we had a handful of great men and women who decided we needed to be able to talk about those issues that are central to hbcu families. this organization was created -- very small and very humble in the beginning -- and today we are growing. we have a voice on the national platform now, and we are trying to make sure the vision is there. we want to make sure we are actively involved. >> let me say, being at a school in texas where our legislature felt that it was okay to pass campus carry because, you know, the english professor is so threatening to the student body, it is particularly concerning because you don't want your campuses to turn into the wild,
wild west. you don't want it to be a shootout. so, it is comforting to know the work that you all are doing. but it is an issue for which there is no perfect answer. >> oh, absolutely not. to the point of conceal carry, i had a young man -- a great student, a conceal carry student -- come to campus, realize he had his weapon on his side, and decide, i need to place it in a secure location until my performance is done. he stuck it in what he perceived to be his friend's backpack. he comes back after the performance and checks the backpack where he thought he put the gun. the gun is gone. another student had realized, oh, my god, i have a gun in my bag. as opposed to bringing it to campus safety, she thought it was cute to get two ounces of marijuana and a couple hundred dollars for the gun. she sold it to a local drug dealer.
i say that to say this: my perception on guns is, with as many laptops, cell phones and things of that nature that come up missing on a daily basis, what makes people think their guns won't come up missing on your campus? if we are saying we are institutions of higher learning, what role in higher learning do guns play on a college campus? they play none. [ applause ] >> one word, also, to finalize. there will be, i want to say on the 13th, 14th and 15th of november, an open carry forum at a college in mckinney. i can't think of the name of the school right now, but i'll get that out to the body. it will be on conceal carry or open carry. we have the opportunity. southern baton rouge and johnson & johnson will participate in that.
we have another college chief who will also be participating in that forum to have these discussions to help shape policy as we move forward with this critical issue. >> thank you very much for your work on that. we are going to take this in a slightly lighter direction. nancy? >> no pressure. >> right. right. how are you doing today? >> i'm doing well, thank you. >> great. i know that nij is making significant investments in research around school safety and justice. in addition to that, there is your work on 21st century policing. perhaps you would like to share with the audience what your research is finding. >> right. so we have, for years actually, invested in policing science and have done so, i think, incrementally, given, again, the different focuses that have come
to our attention. but it was certainly the president's task force on 21st century policing that elevated this and really identified the six pillars where research recommendations were proposed and, of course, encouraged the academy to be responsive to them. so, last year, we went ahead and released a solicitation in direct response to that report, and i am happy to say that we were able to support over $6 million of research in this space. the projects that we supported, i think, are so vital today. for example, we funded one particular study in which a researcher at howard university will be looking at civilian oversight and the impact it is actually having on accountability, and at department of
justice interventions, which, again -- you might think we have evidence on how doj interventions and collaborative reform efforts work once they are in place, and we don't. so, one of the things that i really want to make sure that we convey as scientists is that the evidence base in the area of policing is rather thin. that means that, unfortunately, we often don't have the guidance to provide to local, state or tribal criminal justice systems on how to proceed and what policies or practices are in the best interest of their communities. i cannot overstate that. it is significant. because for us, it means that we are often left conveying to the
field that we are investing in these areas and hope to be able to provide that information. so, for example, with the infusion of technology within law enforcement, we have supported body-worn cameras to ensure that police officers have the tools they need -- and it is a tool. yet we, today, are unable to convey the impact that this is going to have on police departments, on use of force, on strengthening relationships with communities. so, yes, the tool is out there, yet the impact is still unknown. i can talk about safety and wellness, and obviously, we, too, recognize that keeping communities safe is our primary objective. at the same time, we have officer-involved shooting
incidents that are rather complex, which means we need to think about not just the actual individual, maybe, who is hurt or wounded, but the dyad that exists. that research does not exist. again, when i say the evidence base is rather thin, i hope you see the need to continue to invest in these important areas. we also, of course, through our comprehensive school safety initiative and partnerships with federal agencies like the office on violence against women and the cdc, are ensuring that we identify and create that evidence base to ensure our k-12 schools and college and university campuses are safe. $75 million each year since 2014 goes directly to research in this area.
this research is hoping to bring together not only criminologists but educators, law enforcement and behavioral health specialists to identify the comprehensive strategies to ensure we can prevent violence. and we recognize that early childhood trauma plays a significant role in the pathways of individuals who enter your institutions, or those who, unfortunately, don't have that opportunity. so, we have invested in a longitudinal study that is going to be tracking individuals from high school onto college campuses, or not, past, obviously, four years, to get a better sense of how this early childhood trauma that we see, which of course can only be compounded given other
stressors in life, shapes individuals' trajectories. so, when i think of what you can do for us, as a science agency within the department of justice, i would encourage you to have your faculty and students reach out to criminal justice agencies in your communities and offer your support and expertise. i travel throughout the country and visit jails. i visit local police departments, i visit prisons. when i ask what they need from the scientific community and the academy, i am told repeatedly, regardless of the setting, that they wish they had partners to help them understand the capacity of their data and inform policies and practices. they need that. they want that. i hope you take that challenge and encourage and find ways to bridge with these local, state and federal justice agencies.
i also hope that you become aware of the many opportunities we have to support research in this space. we have opportunities for individuals interested in all disciplines. you know, if you care about the criminal justice system, there is room for you. we have a table outside, which i hope you stop by and see. but, nij.gov will provide you with the many, many programs we have for graduate students, for young scholars and early career investigators who have never been through the grantsmanship process. that's what i kept hearing from young faculty who said i can't compete with the x. i can't compete with, you know, the bigger institutes. i can't compete with my mentor for funding. so, i created a specific solicitation to support them in their endeavors. we have graduate fellowship opportunities in the areas of social and behavioral sciences
as well as in s.t.e.a.m. please, please, please, if you can do anything to encourage, again, future scientists and help me create that pipeline, i hope you encourage them to think about their role and how they can serve our criminal justice system in the way we try to do so every day in the department. so, thank you again for this opportunity. >> thank you very much. [ applause ] >> calvin, i know earlier you spoke a lot about the work you guys are doing. maybe you could tell us some of the opportunities that there are for partnerships between your agency and the hbcu community. >> thank you. i have to jump on nancy's question a little bit to add more to that, if you don't mind.
one of the things that's missing -- i not only work at the fbi, but i work with the cops office, community oriented policing services. for the universities in this room, some of the largest programs you have on your campus are criminal justice programs. in those criminal justice programs, i have to say, not enough hbcus are bringing enough ideas to the table about the way to do things better. a lot of white institutions are bringing those ideas to the table, but they don't include the community that really needs that information. when you have criminal justice programs, i think part of the responsibility of that criminal justice program is to put ideas on the table and push those ideas to the federal level to be funded, to change the way things are done. what nancy is saying is people are putting ideas on the table, but they aren't working.
when we go to study them, they don't get studied fully because they don't work. if you have ideas that work, you should be bringing them to the table and adding them to the discretionary process so they can be funded. there are not enough ideas in the process for us to be able to fund. the fbi is all about partnerships. we are ready for partnerships at all times. we have over 100 campus liaisons. they are visiting the campuses talking about the things we offer and the things we are able to help you with. as i said earlier, we want to help you react to and overcome incidents that may happen on the campus that are traumatizing, from active shooters to earthquakes, whatever it may be. we are there to help with that -- with evidence, emergency operations, all those things we are able to do. we also want to help
proactively; we want to help you understand how you can work through these problems beforehand. we want to be there reactively, but we would rather be there before these things happen. the honest part, again, is that a lot of your agencies, a lot of the law enforcement people on your campus, as curtis was saying, are unengaged with us. we are reaching out to them, but because some of them are contractors, and because some of them have so many jobs that they're doing on campus on top of security, a lot of them are not reaching back to us to build these partnerships that you need so badly. so i would go home if i were you and say to your chief, to your campus security person: what is the nature of our partnership with our federal and local partners? get the answer to that question and it really will help you understand where your campus is. i'm not just talking about the
ability of a fire department to come. if something was to happen on campus, what is the nature of the response that would happen on your campus to really help you? at some point, everybody becomes overwhelmed. there's no incident that's happened, whether it's virginia tech or a k-through-12 school or wherever, where the initial responders weren't overwhelmed. there are resources available from fema and from others after that to help with the additional things needed to get you back to where you want to be. you don't want to miss any days of school because, in the end, you have residential students that have nowhere else to go. so every day you don't have class is a day they don't know what to do. we want to get to a point where you're getting back to what the new normal is for your agency and your organization. you need help to be able to do that. we need you to partner with us and to reach out to us as we're reaching out to you.
>> thank you very much. so we are just about out of time, but what i would like to do is give everyone maybe 30 seconds to make a closing statement, and that way we won't be egregiously over our time limit. so we'll start this way and come on back in. >> you start with me? i just finished talking. we're available to partner. we're open to partnership. and, like i said, our agents are responsible for partnering with every single campus in the united states, whether you are a white serving institution or black or hispan -- whatever it may be. they're supposed to be there helping you in this. it's up to everyone in this room to hold us accountable like we're holding you accountable. your students are holding you accountable. >> thank you. >> i certainly again hope you visit nij.gov and see the many resources we have available -- not only on, obviously, our historical
investments but also our strategic plans. we have released various strategic plans in key areas; safety, health and wellness is one. we'll be releasing our strategic plan -- a five-year research plan for the next five years -- which has been shared with omb, which means my agency will be beholden to making these investments, which we think is important. i certainly also would hope that you reach out. i will stick around and be available to answer any questions you may have, because if you are unsure about how to connect with your local and state criminal justice agencies, or maybe there are faculty or initiatives at your institutions that you think may be an ideal fit for our awareness, please let me know. we certainly want to be as informed as possible on the array of issues you're addressing that face our criminal justice system. again, thank you so much for the
invitation. it's been a pleasure. until next time, always. >> thank you, nancy. real quick on my end: those institutions who have not been actively involved at the hbcu level, i really need to see your chiefs, your security directors. i want to thank publicly the federal agencies, because we shifted probably four or five years ago to having the federal agencies provide our training at the national level. and the fbi, of course, has done a wonderful job. catherine and her team at ocr have done a tremendous job. and the list goes on. there are two people. i don't see jacque batiste, but he made it a personal mission of his to basically ensure that we had access to the things at the national level we needed to have. so my hat is off to jay. the last thing i'll leave you
with is to continue to pray for hbcus as we lay to rest leroy, who lost his life last wednesday and departed on to glory. we're going to go down and love on his family and make sure he gets a good home-going. thank you very much for your help. >> thank you, curtis. catherine? >> my thanks also, and i really want to emphasize that you are such unbelievably strong leaders on your campuses. your students are looking to you. please be the change. set the tone that you want for your campus. make sure you're communicating to your students that you support them, that you will be there for them, and that this is the campus they deserve, the campus of their dreams. i really appreciate the leadership that you engage in every day. i hope that you set a tone such that you don't wait for a next moment of horror that brings on gasps of the type we heard today, but that instead yours are campuses that you'd want your own children to thrive in.
>> thank you. so we would like to say thank you to the audience for joining us this morning. can we have a great round of applause for our panelists. [ applause ] and in closing, let me say this. this is the issue of our time. you heard the attorney general tell us this. i think, inherently, we all know this. what comes into conflict at times is how we respond to the issue of our time. this is not the moment for there to be separation between our students and ourselves. we must work collectively. working collectively means we must give their issues the audience which they deserve. it won't be comfortable. they will say things that may make you feel uneasy and may not tap into the truest, best version of yourself. you must fight that, because this is one that we can win together but we cannot win apart. and we need to work together. this is our moment. we cannot stand on the sidelines
and watch rome burn. we will be judged negatively from an historical perspective if we do not get this right. so i would encourage all of us to sit down with our students, listen and give audience to their pain and their concern and find a way to work together. on behalf of -- i guess you're here to do the on behalf stuff, all right? oh, i was headed to church. let me just say thank you, and i am now going to give way to the executive director so that i will be invited back. [ applause ] >> how about pastor sorrell, family. amen. amen. amen. thank you all so much. again to the panel and to michael. we appreciate the conversation. as he has encouraged us, these
are our issues to lean into, to be a part of the student conversations. our attorney general certainly was amazing this morning. we're fired up about this conversation. we hope you'll continue the conversations as you move to the breakout sessions right now. and then we will convene back here for our legacy luncheon to hear from the wonderful marc morial from the national urban league, who will bring a powerful message to us as well. please continue to enjoy the conference. thank you. >> that concludes this morning's special education and justice conversation. continue the coverage on twitter, #hbcu --
tonight on american history tv, victory and concession speeches from three past presidential campaigns. beginning at 8:00, the 1980 election: president jimmy carter's concession speech and ronald reagan's victory speech. at 8:30, george h.w. bush and ross perot. then george bush and al gore's speeches. also, the 1789 debate over the official title for george washington and subsequent leaders of the u.s. all of this tonight on american history tv on c-span3.