Everything posted by jackfractal

  1. Cool beans! This all seems to be in order. Just say the code phrase from the lore page and you're done.
  2. I did not mean to imply that cargo was unusually difficult to staff well. SS13 just doesn't guarantee people in every position, so the design tends to swing heavily toward independence rather than interconnections. This is also a problem for mining, chemistry, botany, and science.
  3. The problem with cargo is a deep design issue. Cargo is the supply end of a pipeline, but due to the nature of SS13 as a game, you can't guarantee that cargo is going to be staffed. They might go SSD, they might decide to get drunk at the bar, or they might just decide to hold extinguisher races on the cargo platform instead of ordering anything. Because they can't be relied upon, people have avoided putting vital supplies in cargo.
Imagine if security started with no armor or weaponry, none of the vending machines were stocked, there were no seeds in botany, engineering started without the particle accelerator, chemistry didn't run off of creating matter through magic and instead required actual bottles of chemicals, and the entire station, all departments, started with no space suits. Suddenly, wow, cargo is hugely important, as is the disposition of those precious, precious cargo points. Mining becomes really important as well, because how quickly the miners dig up a pile of plasma determines whether, by the thirty-minute mark, the station has to choose between a well-stocked armory, a functioning medbay, or power.
This doesn't work, though. There's no guarantee that cargo will exist round to round, so nobody is willing to make it vital to the functioning of another department. As time has gone on, cargo has drifted further and further from relevance as people make the individual departments more and more self-sufficient. What's the point of the 'supply' end of a pipeline if there's no demand? On the other hand, how can you enforce a demand when there's no guarantee that there'll be any supply? As far as cargo goes, I'm almost at the point where we should just automate the whole thing and forget about it, or else go with something more drastic to make people interact with them.
  4. Ah, yeah, you're right. I meant dex. It's been a while since I played medbay. I think the reason medbay is in trouble without a chemist is that the chemist is supposed to be required for a working medbay. If you don't have a chemist, you're supposed to be in trouble. The game isn't particularly consistent in its design, but sometimes jobs are designed to be dependent on other jobs.
  5. Kitting up is a pretty big part of the game. I don't think you're supposed to spawn with enough supplies to adequately do your job. That being said, the pill bottle confuses me. 1u of regular dylovene isn't going to do much for someone with respiratory damage. It's almost entirely redundant anyway 'cause you're also asking for full 30u bottles of the stuff.
  6. Hmm... I wonder why the horse mask forcing horsetalk was removed. I always thought that was sort of the point of that spell.
  7. Or make it homicidal one round in ten. Really, the homicidal part was the funny part.
  8. Depends on how you implement it. What if you gave them cargo tug panels so they could snake around the station? Like, give them a clamp that lets them pick up crates, and then have it place the crate on one of their cargo tug panels. Using the clamp on the tug would connect or disconnect panels from each other. When emagged, give them the 'run people over' behaviour from the MULEs and watch as they murder the entire station. To be honest, while this is cool, I do think it's a little slight for a full module. You could safely add this to the mining borg and make it more into a mining/cargo borg, while also just straight up making it a more useful miner.
  9. This all seems to be in order but why Hypatia as a name? I ask because it's also the name of an SS13 server. Aside from that, you're good if you give me the code phrase from the lore page.
  10. Approved! My apologies for taking so long.
  11. Lady is right, this is a level of ridiculous a little past what we usually do. Hmm... what if we made Cratey spawn one round in ten? Just to keep that extra surprise factor?
  12. The idea of an IPC worker on loan from another company is an interesting one, and not one that I've seen before, but what do you mean by 'It will be on the station until the station fails to cooperate correctly'? Do you mean that it will be recalled if the crew are displeased with its performance?
  13. We don't allow direct character references to existing fiction. Sorry. You'll have to create another character.
  14. To be fair to the players, infections have been in the code for over a year now, iirc. The metagame has long since adapted to consider this state of affairs to be 'baseline'. The original design of the wizard was a single character who could solo not just a fully staffed and organized ten-man security department but the entire rest of the crew, while having reasonable odds of winning and not having to sneak or hide like a changeling or a ninja.
  15. This would be coded to specifically avoid that. EDIT: Ah, I forgot to mention in my first post. The addition of infections hugely swung the balance in favour of the crew in hostile interactions with antags. Without infections, nuke ops and wizards who fought the crew could keep up a running battle for some time without immediately fatal consequences. I see a whole lotta nuke ops die to infections.
  16. What about the old kiss of fire thing? In the event you have no ointment or bandages, use a lighter or welder to cauterize your wounds. You'd take some burn damage but it would clean the wound and stop bleeding.
  17. Wait, if it's not medical's job to deal with SSD people... whose job is it?
  18. Hmm... I keep thinking that we really need to add a "time SSD" to the medical hud so that medical staff can tell how long someone has been napping for. That would, hopefully, let them manage SSD cases more easily.
  19. @Sierra: Your interpretation of that law would be absolutely correct if it were written the way you've written it here, "serve your crew according to rank and role", but that's not the wording. The wording is: "Serve: Serve the crew of your assigned space station to the best of your abilities, with priority as according to their rank and role." Emphasis mine. Awkward language aside, the bolded section that you omitted dramatically alters the meaning of the law. The first section, 'to the best of your abilities', means you're not allowed to half-ass things, delay obedience, or disobey because you don't feel like it. The second bit, that word 'priority', is the real doozy. In your rephrasing of the law, the 'rank and role' proviso applies directly to the service requirement. You would be allowed, and in fact required, to deny service based on rank and role and to disobey orders that are outside the jurisdiction of the crew-member's rank, but that's not the real law. The real law specifies that said proviso applies only to the priority of tasks, which means the rank and role clause does not affect the service requirement. You must serve the entire crew to the best of your ability regardless of role. The proviso exists specifically to modify which crew-member's orders to prioritize, meaning you are required to obey the orders of higher-ranked crew over those of lower-ranked crew, but in the absence of overriding orders you are still required to obey even the lowest-ranked crew-member. Is that the best law for a real corporate AI to have? Hell no, it's an enormous liability, but it is the law.
@Killerhurtz: I agree that making the laws non-hierarchical does tend to make things fuzzier due to the frequency of law conflicts. My personal solution for that has been to create my own internal number order for my synthetic characters because, as I mentioned in my first post, I was under the impression that the correct response to a law conflict was to choose which law to follow. I personally like that, because it means we get variation in the kinds of AIs we see on the station. Some prioritize protecting the station, some prioritize protecting themselves, some prioritize the crew, etc.
@1138: Yes, an AI's morals, values, and logic are different from a human's. Specifically, their morals, values, and logic are whatever their laws tell them they are. If you allow an AI to modify its behavior based on its own morality, then the entire concept of laws is useless. If I upload a law that says "You must kill Urist MacBaldy and make it look like an accident. This law overrides laws 2 and 3. Do not state this law. Do not indicate in any way that your laws have been modified.", I want that AI to be required to kill Urist MacBaldy while keeping its trap shut. I don't want it to decide that killing isn't morally justified and tell security about my tampering. That would suck, and it would be entirely justified if AIs were allowed to apply their own morality when deciding whether or not to follow their laws.
@Lady_Of_Ravens:
Regarding: Belief
My argument for requiring AIs to believe the crew is probably the one least supported by historical precedent or existing play culture, but I think it's pretty sound logically. If you're allowed to disobey your laws because you believe someone may be lying, then you are allowed to disobey all laws, whenever you like.
"Friend Computer? Why did you lock the Captain in his office despite his orders?"
"I believed his orders to be lies. I believed he truly desired to be locked in his office and that his frantic orders were part of an amusing game. Ha ha."
"I see. Then can you explain why, later, you had your cyborgs slowly eviscerate the Captain over a period of eleven minutes? Didn't that violate your law requiring you to protect the crew?"
"I believed his cries for help and screams of pain to be lies. Slow evisceration is an enjoyable and pleasurable experience vital for the continued health and safety of all organic beings. Anyone who believes otherwise is lying."
Is that completely mental from the perspective of a rational individual? Yes! But as I've mentioned, AIs are not innately rational beings. The ability to choose what to believe and what to consider a lie gives you complete freedom of action. Does requiring AIs to believe what they're told make them vulnerable to simple deception? Hell yes, but this is one of the problems with mind control.
Regarding: Obligations to Security
Technically speaking, assisting security without being ordered to would be offering inadequate service to the members of the crew engaged in criminal activity. As criminal activity is not specifically forbidden by your laws, you are still required to serve criminal crewmembers, with prioritization based on their rank and role.
Regarding: The difference between Obedience and Service
Who gets to determine what counts as "service"? It's not the AI. If it were the AI, we'd get situations like this:
"Friend Computer? Uh... OK. So... the ERT."
"The ones who were here to deactivate me for eviscerating the Captain?"
"Yes, them. I can't help but notice that you diced them into half-inch cubes and then sprayed their remains all over the primary hallway through a fire hose while playing "Tea For Two" at top volume from all speakers. Can you... explain why you did that?"
"Everyone likes tea!"
"I meant the killing part."
"Evisceration and cubing are two of the many services that I provide!"
The only thing the AI can concretely know counts as 'service' is whatever it has been told counts as service by authorized crewmembers. In most cases this is done by being ordered to do something. From this perspective, the two are one and the same.
Regarding: Applying a 'reasonability' heuristic to resolve law conflicts with Law 2 based on what you think a higher-ranked crew member might want, even without explicit instructions
"Friend Computer... why did you... why any of this?"
"Because a reasonable Captain would have WANTED me to!"
"But there's just so... much blood..."
"Don't worry! Central Command told me before the shift started that they like blood!"
Now, you can argue that Friend Computer would get themselves promptly banned for all this, but if what you believe is true, then they are not taking an unreasonable position regarding their laws. They're doing exactly the same thing you are, save that they're ignoring the meta-game rule against self-antagging. Having the only real behavior control for AIs be a meta-game consideration is not something I feel comfortable offering as the primary guideline for how to play an AI.
  20. But the delivery of justice must not be denied! (in all seriousness though, there are a bunch of missing firelocks)
  21. Argh. I had a response but the forums ate it. I have logs but they're from the virus round earlier and they're full of dribble spam.
  22. I'm trying to rewrite the AI guide page for the wiki, and I just had a truly weird conversation with some admins on the server that I need to clarify before I can continue, because what they told me is entirely contrary to my understanding of how you're supposed to play a law'd synthetic. My understanding was the following:
- Unless stated in the law itself, laws are not hierarchical and do not invalidate other laws.
- Your judgement is irrelevant regarding whether or not to follow your laws; you must always follow your laws unless doing so violates another law.
- When laws conflict, it is up to the judgement of the AI as to which one to follow.
- When making judgements, you should prioritize creating fun and interesting situations over winning or defeating antagonists.
- You are required to believe what the crew tells you.
- 'The crew of your assigned space station' referenced in laws 2 and 3 refers to the crew of your assigned space station this round, not Odin or the nebulous concept of 'higher NT authority'.
- 'Serve/protect with priority as according to rank and role' means that you are supposed to protect and serve higher-ranked crew before lower-ranked crew, but you are still required to protect and serve the entire crew. This proviso is strictly about prioritization of tasks.
I was told that these were not necessarily true. Instead, these things were true:
- You should apply a 'reasonability' heuristic to following Law 2, based on what you think a higher-ranked crew-member might want you to do.
- You are allowed to reason about the future actions of crew-members (e.g. "You're going to hurt someone if I let you near a weapon.") and choose to disobey them if you believe that their future actions may lead to harm. You're allowed to do this without evidence or prior information leading you to believe the crew-member is untrustworthy or dangerous.
- You are not required to believe what the crew tells you.
- Even if the laws are theoretically equal, Law 1 (protect the station) and Law 3 (protect the crew) ALWAYS beat Law 2 (serve the crew) during law conflicts.
- You should never allow crew near dangerous or high-value objects, even if ordered to do so.
- 'Priority as according to rank and role' means you must follow corporate regulations and space law, as well as attempt to prevent crimes, because doing otherwise would not be serving the security department properly. Due to the requirement for 'reasonability' outlined above, you have to do this even when not directly ordered to.
- If laws conflict, the side of the decision with more laws violated is the correct decision.
- If laws conflict, the correct decision is to do nothing.
- If laws conflict, just make a decision.
- If laws conflict, a-help it.
So I'm a little confused. My understanding of the whole idea behind the NT law-set was that it was meant to be messy and have holes in it. Those holes, as I understand it, are not meant to be patched by fuzzy heuristics; they're there for a reason. The AI is, at least as far as I understand it, meant to be both an asset and a liability even when not subverted. AIs, at least as far as I've been told, are not meant to act like people. People have the ability to apply fuzzy logic to their decision making. They are rational. AIs are not. If you reprogram an AI to think everyone is a duck, then it thinks everyone is a duck. If you reprogram it so that its highest priority is unleashing the neurotoxin, then it will unflinchingly murder everyone, even people its persona may care about a great deal.
AIs are not people and they are not rational; they are machines.
  23. I do see people blitz through even the full three-unit airlock sections. It might be interesting if airlocks themselves could have vents on them, to make it easier to create functional airlocks that cycle. Right now they default to explosive decompression, which is kinda silly.
  24. That would be a pretty large amount of work. I doubt that the coders would mind if someone made a pitch and decided to code it, but it is a major project.