Month: February 2021

The Future of Workplace Productivity: Smart Desks and UltraSharp Monitors

Our customers operate in highly competitive industries where every ounce of productivity is critical. Creating a workplace that empowers employees to be productive continues to be one of the key priorities for organizations of all types. While we know there are many factors that influence a workplace's productivity, customers keep listing one issue as a priority: ensuring their employees have the right tools for success.

There are many considerations to take into account to ensure employees are in the best environment for productivity. Employees need maximized ergonomics to foster efficiency as well as increased interactivity to take advantage of the entire workforce's capabilities, while organizations need to be able to customize work styles to suit different roles.

At Dell we pride ourselves on knowing our customers, and we work closely with them to stay ahead of the technology demands and business challenges that drive their decisions. It's this understanding that fuels our innovation, along with a desire to provide truly connected offerings that incorporate every aspect of our industry-leading hardware and software suites. Although incremental adjustments continue to be made to keep existing products competitive, Dell often leaps ahead of the curve to provide customers with the next big thing.

At Dell World in Austin today, I was extremely excited to preview two industry-leading concepts that will provide our customers with productivity and performance opportunities they have only dreamed about: the Dell UltraSharp 27 Monitor and the Dell "smart desk" workspace-of-the-future concept.

Unprecedented detail and compatibility to power the most creative concepts

Our groundbreaking Dell UltraSharp 27 Monitor is the world's first display with Ultra HD 5K resolution (5120 x 2880), and it will be a game-changer for our creative customers. Dell serves the creative technology needs of Academy Award-winning visual effects studios, Grammy-winning artists and the next generation of students who will find entirely new ways to leverage technology in fields such as arts and design. These customers' day-to-day jobs often depend on productivity, efficiency and precise detail to bring complex, out-of-this-world images to life in the face of rigid deadlines. Ultra HD 5K resolution is equivalent to four times the resolution of QHD and seven times the resolution of Full HD (see the quick sanity check of these figures at the end of this post). That means photographers and our other creative customers who require the highest possible resolution to optimize their work will be able to see and control the imagery they invent at an unprecedented level of detail.

By integrating the Dell UltraSharp 27 Monitor with Dell's entire innovative solution suite, we can offer our customers true end-to-end solutions, always backed by Dell's reputation for security, manageability and reliability. This monitor has six USB ports and a media card reader to seamlessly connect an array of peripherals; whatever our customers need to push the limits of their creative concepts will be supported. The Dell UltraSharp 27 Monitor is available for pre-order on November 13, ready to ship on December 18.

The future of office collaboration

Once an innovation proves to greatly increase productivity, it becomes hard to remember how we got by with the antiquated method it replaced: Rolodexes vs. digital contacts, faxes vs. emails, typewriters vs. computers.
The Dell smart desk concept revolutionizes the productivity of digital artists, engineers, architects and scientific analysts operating high-performance professional applications with the introduction of a new interactive zone. By combining interactive LCD touch screens with innovations in user experience, participation from key Independent Software Vendor (ISV) partners and Dell Precision workstation performance, the Dell smart desk will change creative, design and analysis workflows for the better while introducing a new, immersive way for professionals to interact with those demanding workflows.

Placed in a natural, horizontal position that is more comfortable for touch interaction, the Dell smart desk provides a work surface that aligns closely with the productivity requirements of professional software applications. Paired with a vertical display and powered by plug-ins to key ISV applications, it intuitively separates seeing from doing; primary work activities are close to the user while secondary ones are further away, accessible through intuitive screen swipes. Ten-plus-finger touch as well as high-performance pen functions and gestures, paired with a new generation of tools naturally located on a horizontal surface, will allow users to easily manipulate digital content without stepping away from the task at hand. This digital desktop allows multiple desks to be clustered around specific projects, notes to be searched and shared, and better organization through scaling and stacking of growing content. In fact, any smart desk workspace will allow users to pick up their work session wherever they left it, regardless of location. This means the creative cycle will never miss a beat.

These conceptual product previews are not only industry-leading; they have the potential to transform the way we work. They represent an exciting shift in how we interact with technology in the workplace. Most importantly, they are innovations that were not created in a vacuum. Not only do the Dell smart desk and the Dell UltraSharp 27 Monitor feature the latest technological developments, they also address productivity issues of real importance for every customer.

It's this mix of innovation and practicality that really excites us at Dell – and it's a notion that reflects the entrepreneurial spirit Dell was founded on more than 30 years ago.
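For readers who want to check the resolution comparison above, here is a minimal sketch of the pixel math. The 5K dimensions come from the post; the QHD and Full HD dimensions are the usual 2560 x 1440 and 1920 x 1080:

```python
# Pixel-count comparison behind the "four times QHD, seven times Full HD" claim.
resolutions = {
    "Ultra HD 5K": (5120, 2880),  # stated in the post
    "QHD":         (2560, 1440),  # standard QHD
    "Full HD":     (1920, 1080),  # standard Full HD
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    ratio = pixels["Ultra HD 5K"] / count
    print(f"{name}: {count:,} px -> 5K is {ratio:.1f}x the pixel count")
```

Running it gives exactly 4.0x QHD and about 7.1x Full HD, which matches the figures quoted above (with the Full HD ratio rounded down).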

New SCv3000 Array Lights up the SC Series Entry Category

This week we're announcing our new Dell EMC SCv3000 Series arrays, the latest offering in the popular SC lineup. Designed to provide unprecedented customer advantages at the "most affordable" end of midrange storage, this release signals an increasingly bright future for the overall SC Series, already poised for significant share gain in an expanding market.

With modern auto-tiering, hybrid flash configuration options, intelligent data reduction and enterprise-class management, SCv3000 is a low-priced powerhouse. It showcases Dell EMC's uncanny ability to make leading technology practical and attainable for more customers.

Complete portfolio refresh

SCv3000 is the final step in an upgrade process that began with the award-winning SC9000 array. Following that release, SC7020 and SC5020 also offered quantum leaps within their respective categories. SCv3000 now completes the transformation, making SC Series a more heterogeneous family, with unified software capabilities across a diversified hardware platform.

SCv3000 goes beyond the previous-generation entry product (SCv2000), upgrading it with key capabilities of the larger SC arrays. Performance has ramped up 50 percent with faster processors and 2X more memory, and additional SC Operating System features have been enabled, including:

- Data Progression – now build auto-tiering, "zero-to-100 percent flash" hybrid solutions! (A toy illustration of auto-tiering appears at the end of this post.)
- Intelligent Compression – dramatic cost savings on both SSDs and HDDs
- Multi-array features:
  - Full replication with SC9000, SC7020, SC5020 and SC4020
  - Federated clusters (Live Migrate)
  - Native auto-failover/auto-repair (Live Volume)
  - Cross-platform replication/management with PS Series (EqualLogic) arrays

It's a lot of technology to cram into a small package – and it's precisely the combination of capabilities our small- to mid-size customers have been requesting to help them compete with larger rivals in their respective industries.

Highly extensible on-ramp

Most importantly, SCv3000 gives these budget-conscious IT managers a truly future-proof introduction to advanced storage. Where else will you find a customer-installable array with a starting street price under $10K¹ and all these capabilities?

- Robust VMware, Microsoft and other third-party integrations
- Ever-expanding Dell EMC ecosystem support (NetWorker, Data Domain, VPLEX, PowerPath, etc.)
- Seamless federation/replication with a family of larger midrange systems

The bottom line is buying confidence. SCv3000 customers are empowered to move forward quickly, trusting a proven architecture backed by Dell EMC's world-class innovation and customer care. As needs change, they can move up in the SC Series portfolio at any time while preserving and extending their initial investment.

Stellar upgrade options

And believe me, there's still plenty of room to move up! Fast as the SCv3000 is, the adjacent SC5020, launched in May, offers twice the real-world performance (>250,000 read/write IOPS, or >370,000 max reads), not to mention Intelligent Deduplication. And it just gets better from there, with SC9000 now pegging max speeds well over 400,000 IOPS.²

Whether you step up to that extra performance right away or integrate it later with included multi-array federation, you'll be able to move your workloads among arrays quickly without interrupting them. The point is, whatever your starting point, SC Series has a right-sized, right-priced and right-performance solution to meet today's needs and evolve wherever your business takes you.

https://www.youtube.com/watch?v=xVJWjmKhI7g

Stay tuned for more good things from SC. Today's announcement won't be the last you hear regarding this dynamic, interconnected portfolio. But if you've been waiting to get on board because of feature limitations at the affordable entry level, now is definitely the time to let your best apps shine on the SCv3000.

Learn more about SCv3000 Series Storage Arrays:
- SC Hybrid Arrays Spec Sheet
- SC Series Animated Video Overview

1 – Based on Dell EMC internal estimates. Actual customer price may vary based on a variety of individual circumstances.
2 – Performance results based on internal tests performed by Dell EMC, April–June 2017. Actual performance will vary based on configuration, usage and manufacturing variability.
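As flagged in the Data Progression bullet above, here is a toy illustration of what block auto-tiering means: frequently accessed blocks get promoted to flash, cold blocks stay on disk. Data Progression's actual policies are not public, so every name and threshold here is hypothetical:

```python
# Toy block auto-tiering: place hot blocks on SSD, the rest on HDD.
# Purely illustrative; not Data Progression's real algorithm.
from collections import Counter

HOT_THRESHOLD = 100  # hypothetical: accesses per period that count as "hot"

def retier(access_counts: Counter) -> dict:
    """Map each block to a tier based on one period's access counts."""
    return {block: ("ssd" if hits >= HOT_THRESHOLD else "hdd")
            for block, hits in access_counts.items()}

# Example period: block 7 was touched 250 times, block 42 only twice.
print(retier(Counter({7: 250, 42: 2})))   # {7: 'ssd', 42: 'hdd'}
```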

Is It Time To Revisit Your Architectural Strategy?

As in many industries, it's easy for financial services companies to get complacent and fall into lulls. We've all been there: you get into a process, you build out the process, you get comfortable with it, and you don't generally question it.

But we're living in a new world order when it comes to security, risk, hacks and breaches, spanning cyberterrorism, identity fraud, nation states and the like – all of which bring significant and dire consequences for financial services organizations and their customers.

Data has become a commodity – and the reality is that data has a monetary value. Personal and identity-related data has an even higher monetary value. And financial services firms, whether in fintech, banking, credit reporting or other segments, possess the highest-grade data available, putting the value associated with that data that much higher.

What boggles my mind is that in this new world order of increasingly sophisticated threats, coupled with the rising value of data, why aren't more financial services institutions making security a priority – and more so a continual priority from day one? Why aren't they being more vigilant?

The answer is threefold, and it comes down to three big vices plaguing financial services and the broader business community:

Inertia

The reality is that big companies generally don't move very fast. There's a tendency not to change things unless they're broken, and that applies to everything from corporate policies to IT infrastructure. It can be a challenge to rationalize an investment in something that appears to be working well, whether it's poorly architected or not.

However, as the headlines have shown in recent months, it's paramount that financial services organizations examine their authentication strategies, their encryption strategies and their architectural strategies. This also involves putting good "cyber hygiene" practices into play, such as applying security patches and doing the due diligence to ensure architectures minimize risk with data-at-rest encryption, among other protective measures (a minimal illustration of data-at-rest encryption appears at the end of this post).

No doubt it's difficult to unwind a systemic culture of inertia, but continuing to invest in systems that merely appear healthy may not be the best option longer term.

Hubris

No one believes their company will be compromised, despite the overwhelming odds that almost everyone, over time, will be. Just like the children of Lake Wobegon that are "greater than average," many companies believe with confidence that their architecture and security will beat the odds. This is all the more true of large companies with strong track records of success. Again, as recent headlines show, security is not a "fix it and forget it" endeavor.

Naiveté

People in large companies too often want to bury their heads in the sand when it comes to security and risk, thinking "this couldn't possibly happen to me."

So where does the responsibility fall when it comes to security? Is it with the CSO? IT? The general manager? It's really all of the above; in a true security crisis it's useless to point fingers. Sure, the CSO is often the executive that takes responsibility, but they can't be expected to defend everything, and their budget isn't limitless.

For emerging financial services institutions, many of which may not have a strong security background, it's a matter of engaging in a dialogue about what's truly at risk when storing their customers' personal information.
This includes the holistic architecture that has been constructed and its long-term viability in an increasingly dynamic industry.

Consider the three vices discussed above and make a frank, honest evaluation of whether your financial services organization might be guilty of any of them. Is there a good cyber strategy in place? Are new security patches being downloaded and installed? If massive corporations and credit reporting agencies – well, you know them by now from the headlines – are being hacked and crippled by cybercriminals, what's to stop your organization from being the next victim? If you don't want your analytics, trading or cloud platforms left without solid security, we welcome a deeper discussion around how not to be a #cybersecurity target.
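As promised above, here is a minimal sketch of data-at-rest encryption. It uses the widely available Python cryptography package's Fernet recipe; this is a generic illustration, not a description of any particular vendor's product, and a real deployment would add key management, rotation and access controls:

```python
# Minimal data-at-rest encryption sketch using the "cryptography" package.
# Illustrative only: real systems keep keys in a KMS/HSM, never beside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in production, fetch from a key management service
fernet = Fernet(key)

record = b"name=Jane Doe;account=12345678"   # hypothetical customer record
ciphertext = fernet.encrypt(record)          # this is what gets written to disk

# Later, an authorized reader decrypts it back:
assert fernet.decrypt(ciphertext) == record
print("stored bytes:", ciphertext[:24], b"...")
```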

A 360 View of the IoT Landscape Three Years Later

The general IoT environment has transformed since I joined the Dell Technologies OEM & IoT team over three years ago. While many of my colleagues focus on specific vertical industries like manufacturing, telco or marine, my role is broader. As a result, I bring a different perspective: the ability to take a 360 view of what's happening across multiple industries.

So, three years later, what new developments am I seeing? Are there common trends? Is one vertical leading the way? How is IoT developing from an Edge and infrastructure perspective?

1. The Rise of Edge & On-Prem Computing

First things first: I believe that the role of edge computing has radically transformed. Three years ago, edge devices were there to collect and aggregate data before analysis and storage in the Cloud. Fast forward to today. Now, computing and real-time analytics are increasingly being managed at the Edge.

Why this transformation? There are three main factors: cost, latency and security. Take the case of the autonomous vehicle, the ultimate IoT device. It has to make split-second decisions and adjustments based on information received from an array of sophisticated sensors, responding in real time to pedestrians, traffic, road signs and potential hazards like detours and accidents. If you're the driver of the oncoming car or crossing the road with your kids, would you really want the data to go to the Cloud first for analysis? The same holds true for a production line in a factory. You cannot afford for the line to be down while you wait for analysis to return from the Cloud. Hence the inevitable rise of Edge and on-prem computing.

Of course, the Cloud will continue to be an important element in the overall IoT picture, but it's not going to be the much-touted all-singing, all-dancing solution. Increasingly, I see a hybrid model emerging. In this new-look world, edge analytics and on-prem computing will do the heavy lifting, with key outputs sent to the Cloud for visualization and backup. In this way, I see the Edge and the Cloud operating together in harmony as part of a continuum.

2. It's a KOTS World

Our mantra has always been COTS; namely, we use commercial off-the-shelf compute building blocks, based on open standards, that can be configured, customized and pre-qualified. That core principle has not changed.

However, our IoT partners provide very valuable resources and services. To enable them to do their magic, they need a variation of this model – what one of my colleagues, Jeff Van Horn, aptly calls "KOTS": "kinda off-the-shelf" commercial building blocks. I would characterize KOTS as the best of both worlds, and one that is particularly suited to the IoT landscape.

3. Leveraging Existing Infrastructure Through Open Standards

Of course, many of our customers, particularly in manufacturing, already have intelligent and high-cost compute environments in place. What I find interesting is how they're now becoming increasingly savvy in leveraging and reinvigorating that investment.

In the past, these customers would have been locked into paying annual licensing fees to vendors for additional services, like predictive maintenance on a rolling production line. Now, using a standard rather than a vendor-specific compute device, customers are in the driving seat, controlling what they want to add and how they go about it. An open-standards edge device also delivers flexibility in terms of what can be connected.
Again, I am increasingly seeing a hybrid approach, where customers connect different PLCs into a central gateway device (a toy sketch of this edge-gateway pattern appears at the end of this post).

The good news is that whether you have existing Honeywell, Siemens, Allen-Bradley or Schneider specialist hardware, you don't have to rip and replace $2 million worth of manufacturing equipment. Even better, you're no longer tied to proprietary standards or worried about how it will all connect and work together. Thanks to edge computing and open platforms from companies like Dell Technologies OEM & IoT, customers can integrate all the elements together and future-proof their investment.

4. Key Industries

It's no surprise that manufacturing and industrial automation industries, like mining, continue to lead the way in IoT. These industries have a long and proud tradition in sensor technology and automation; IoT is simply the next logical step on a progressive journey.

However, we're also seeing machine builders increasingly embrace IoT. What do I mean by machine builders? Effectively, companies that provide their customers with an integrated turnkey solution, consisting of our hardware and their IP. Take the spectrometry world, where you have specialist lab instruments, packaged with proprietary software, sitting on either a Dell Technologies edge device or an appliance server, with the software and hardware working together to measure the chemical, physical and biological components of liquids like blood, beer or wine. (Quick aside for the wine connoisseurs among you: the good news is that this will help the vineyard scientifically determine the optimum time to pick the grapes, but more on this later.)

5. Industry Collaboration Is the Glue

Of course, challenges remain, and as a leader in the industry, it's important that we help make the IoT journey easier. Back in 2017, we helped launch the EdgeX Foundry, an open source project within the Linux Foundation. This was all about developing an open framework for interoperability between IoT devices and applications. The project has seen a steady increase in the number of backing organizations and developers contributing across the globe, in addition to our own ongoing inputs. This has certainly helped provide solutions to many of the open IoT questions.

Maybe you run marathons to keep fit or relax? Well, if you compare an IoT project to a decent run, the EdgeX Foundry will bring you within sight of the finishing line, delivering all the core building blocks you need. However, if the project is complex, you may need a bit of a leg-up for the last few miles. That's exactly why, in 2017, we invested over $1 billion in IoT research and have provided funding to new companies like Nexiot that specialize in smart sensors, big data algorithms and ultra-low-power embedded technology.

In summary, the big trends I see are edge computing, KOTS, open standards, flexibility and industry collaboration. While there have been huge advances over the last three years, I believe that the best is yet to come. A picture paints a thousand words, so watch this space for a series of follow-up blogs focused on interesting customer IoT use cases.

What are your views on the IoT market? Have you noticed any additional trends? I would love to hear your views and would be delighted to answer any questions!
Do join the conversation.

Come meet us at Olympia London on Thursday 25th April at IoT Tech Expo Global: https://www.iottechexpo.com/global/

Join me at IoT Tech Expo Global for the following sessions on the 25th:
- 12.30–12.50: IoT Edge Computing – Managing Your Data
- 14.30–15.10: Panel – Predictive Maintenance: How to Unlock True 'Actionable Insight'

To learn more about Dell Technologies OEM & IoT Solutions, visit: www.dellemc.com/oem

Keep in touch. Follow us on Twitter @DellEMCOEM, and join our LinkedIn OEM Showcase page.
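As referenced in the hybrid-approach discussion above, here is a toy sketch of the edge-gateway pattern: poll several local sources, analyze at the edge, and forward only a compact summary to the cloud. Everything here is simulated and hypothetical; a real gateway would read PLCs over a fieldbus protocol such as Modbus or OPC UA and publish over MQTT or HTTPS:

```python
# Toy edge gateway: read local (simulated) PLC values, decide at the edge,
# send only aggregated results upstream. All names and thresholds are hypothetical.
import random
import statistics

def read_plc(line_id: str) -> float:
    """Stand-in for a fieldbus read, e.g. a motor temperature in Celsius."""
    return random.gauss(60.0, 5.0)

def push_to_cloud(summary: dict) -> None:
    """Stand-in for an MQTT/HTTPS publish of the key outputs."""
    print("cloud <-", summary)

ALERT_TEMP = 70.0  # hypothetical threshold acted on locally, in real time

for cycle in range(3):  # a real gateway would loop indefinitely
    samples = [read_plc(f"line-{i}") for i in range(4)]
    push_to_cloud({
        "cycle": cycle,
        "mean_temp": round(statistics.mean(samples), 1),
        "max_temp": round(max(samples), 1),
        "alert": max(samples) > ALERT_TEMP,  # the edge decision
    })
```

The design point mirrors the post: the raw samples never leave the site; only the aggregated outputs and alerts travel to the cloud for visualization and backup.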

Baker's Half Dozen – Episode 7

Episode 7 Show Notes:

Introduction with Matt Baker

Item 1 – Data network effects are (mostly) BS. Data is rarely a good strategy for defensibility. (Andreessen Horowitz)

Item 2 – Goldman Sachs and Data Moats (Goldman Sachs)

Item 3 – Just because data is frozen, doesn't mean it's hard to retrieve.

Item 4 – 7 Rs of the application landscape (Citrix)

Item 5 – Public cloud fight and disruption (ZDNet)

Item 6 – What should we focus on for AI systems?

Item 6.5 – Don't build cathedrals when stick-frame homes will do!

Close – Disagree, agree, or just chat with Matt using #BakersHalfDozen

Not Just Another G: What Users Want

This is the third installment in our series Not Just Another G, which provides insight into 5G and what it means to the service provider industry, so providers can help end users achieve what they want. Missed the first two posts? Catch up here.

User experiences and workloads are driving the next generation of mobile computing. They have a direct impact on the evolution of wireless architecture and compute infrastructure to best meet connectivity, latency and compute processing needs. 5G is a direct manifestation of what users want, and Dell Technologies is helping service providers build their networks to achieve this end goal.

Users want on-the-go access to their data and applications. They want access to work, family, social connections, entertainment, sports, communities, health data – literally everything – and they want to access it right away, from anywhere. To deliver optimal user experiences, some application use cases need higher throughput (e.g. content delivery), while others need lower and more predictable latency (e.g. AR/VR, gaming, telehealth). Users want access to new applications and increasing amounts of data, while keeping costs roughly the same as they are used to paying today.

This requires telecommunication companies and other connectivity providers to look at innovative ways to meet these demands. Modernization of their network infrastructure is key, as is finding ways to monetize user behavior and bits. New, vertical-focused applications are emerging to make use of the higher-performance 5G networks, and many of these use cases are mission-critical (e.g. healthcare, finance, intelligent connected vehicles, industrial IoT).

This emerging connected world thrives on an ever-increasing need for speed of access (low latency) and ever-increasing access bandwidth (throughput). Capacity revolutions in silicon transistors, magnetic and now silicon-based media, and the available radio spectrum, which began in the second half of the last century, have continued unabated in the 21st century and show no signs of slowing down. Like a universe that is expanding and accelerating at the same time, both network access latency and throughput are improving at the same time!

5G mobile access technology is the latest manifestation of this trend. 5G promises to deliver download speeds of 10 Gbps – 1000x faster than 4G – enabling an entire HD film to be downloaded in under 10 seconds. 5G also promises latency of less than 1 ms, 50x better than 4G's latency of 50 milliseconds (a quick back-of-the-envelope check of these numbers appears at the end of this post). But let's be clear: these super-fast speeds go well beyond faster internet. It's truly Not Just Another G.

As network speeds increase, the amount of data that is produced and needs to be processed is also increasing. 5G enables this massive amount of data produced by devices and sensors to be delivered to the edge locations or edge clouds where it can be analyzed. Similarly, 5G enables an increasing amount of media and gaming content to be delivered to users wirelessly.

Dell Technologies excels at bringing to market the infrastructure hardware and software on which many of these services will be hosted and optimized. Critical technologies such as software-driven network optimization and scaling, novel memory and storage technologies such as persistent memory, and lightweight container software with the security characteristics of a virtual machine are being created in our global engineering labs. We can't wait to share this exciting future with you!

Thanks for tuning in to our Not Just Another G blog series.
Stay tuned for a variety of 5G use cases, highlighting lifesaving capabilities as well as an exploration of our technology solutions from the edge to the core to the cloud. The possibilities are endless!
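As referenced above, here is a quick back-of-the-envelope check of the 5G numbers. The 10 Gbps, 1 ms and 50 ms figures come from the post; the film size and the implied 4G baseline are my own assumptions, flagged in the comments:

```python
# Sanity-checking the 5G claims quoted above.
speed_5g_gbps = 10        # from the post
film_size_gb = 10         # assumption: a full HD film is roughly 8-12 GB

download_s = film_size_gb * 8 / speed_5g_gbps   # gigabits / (gigabits per second)
print(f"{film_size_gb} GB film at {speed_5g_gbps} Gbps: {download_s:.0f} s")  # 8 s

# "1000x faster than 4G" implies a ~10 Mbps real-world 4G baseline (assumption):
print(f"throughput ratio: {speed_5g_gbps * 1000 / 10:.0f}x")

latency_4g_ms, latency_5g_ms = 50, 1            # both from the post
print(f"latency improvement: {latency_4g_ms / latency_5g_ms:.0f}x")
```

At an assumed 10 GB per film, the download takes 8 seconds, consistent with the "under 10 seconds" claim.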

It’s Time for a Multi-Cloud Approach that Works for Health IT

Supporting multiple cloud providers is now a requirement in healthcare. Health IT organizations are being asked to manage cloud-based solutions ranging from SaaS-based applications to collocated equipment at a service provider. A recent survey found 35 percent of healthcare organizations have more than half of their data or infrastructure in the cloud. In fact, there are several reasons to consider off-site cloud providers as the destination for some of your healthcare IT needs.

The typical drivers for using cloud-based products include cost, performance and security. Other considerations are access to capital, staffing and regulatory needs. In some cases, healthcare organizations are not looking to make any further IT investments in data centers and have developed a "cloud first" mentality, but this may lead to an oversimplified view of a more complex challenge.

Just as every organization must have its own specific business model, each healthcare provider's data strategy must be driven by the needs of its specific clinical and business workloads – not the other way around. Cloud strategies are ever-evolving rather than a simple one-off solution. For example, on-premises solutions are better for risk reduction and monitoring, while off-premises solutions may offer better manageability, ease of procurement and cost. For this reason, every healthcare provider must determine its priorities to optimize a cloud strategy that matches the best location for its data and applications.

A multi-cloud infrastructure offers the ability to identify and monitor information across the entire healthcare data ecosystem through a single pane of glass, simplifying intelligence at the point of care and collaboration among clinicians. This strategy provides a consistent operating model and simplified management across private clouds, public clouds and edge locations, along with the flexibility to adapt to future changes in health IT.

Common Control Plane for Multi-Cloud

Dell Technologies cloud-based solutions have been designed to deliver flexibility and choice – to help cut through the chaos. Our solutions offer access not just to the hyperscalers (e.g., Amazon AWS, Microsoft Azure, Google GCP), but also to hundreds of VMware Cloud Providers (VCPP). Your private cloud runs on our best-in-breed Dell EMC infrastructure, and VMware Cloud Foundation further controls the environment so you can seamlessly move workloads among public cloud providers.

To fully leverage a multi-cloud environment, your organization needs to contain cloud sprawl to effectively manage the entire application portfolio. We have seen many organizations move to the public cloud without a long-term plan in place, ultimately forcing them to bring workloads back in-house to control cost, performance and security. We recommend a pragmatic approach, assessing applications and workloads for the best landing zone. With a common control-plane management interface, your organization can visualize, evaluate costs and control risks associated with the computing environment across multiple cloud providers.

Steps to Building your Multi-Cloud Environment

1. Modernize your in-house infrastructure – This includes choosing best-in-class hardware and software with the most resilient and efficient architecture to build upon.
This architecture can be a traditional three-tier architecture (separate server, network and storage platforms), engineered converged infrastructure (CI), or a hyper-converged infrastructure (HCI) architecture with an appliance-like design. As you move from traditional three-tier to converged to hyper-converged designs, management is simplified and total operating costs are lowered. HCI offers single-button upgrading and patching, as opposed to individually patching all the disparate components; typically, 20%–50% of patches for software bugs can introduce new, unknown problems. The goal of modernizing IT infrastructure is not only to lower costs but to provide the most resilient and performant platform to run critical applications.

2. Virtualize the environment – Most health IT operations have virtualized applications or workloads today; in fact, 93% of hospitals are already using VMware products. To set the foundation for cloud-like operations, increasing automation and instrumenting the environment is necessary. Applications should be surveyed and qualified for their ability to be cloud-enabled. Assessing the cloud readiness of workloads will provide the basis to determine the optimum path. IT should aim to create a self-service portal where your internal customers can perform basic tasks and IT administrators can see how workloads perform. The more automation is built into an IT environment, the more IT operations can be streamlined.

3. Transform your operating model with automation – Workload placement should be optimized against cost, performance and security criteria, incorporating application discovery (a toy scoring sketch appears at the end of this post). You should discuss whether a workload is best kept in-house or moved to a private or public cloud. VMware Cloud Foundation (VCF) can help you place each workload in the best destination based on your defined criteria.

Learn More

Dell Technologies can help your healthcare organization remove barriers to simplify your multi-cloud adoption. Visit the Dell Technologies booth #2121 at HIMSS 2020 to speak with our subject matter experts to learn more.

UPDATE 03/05/2020: Since the publication of this blog, the HIMSS 2020 conference in Orlando has been canceled. If you'd like to learn more about our solutions, please visit the Dell Technologies Healthcare page.
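As mentioned in step 3, here is a toy sketch of criteria-based workload placement. The weights, thresholds and zone names are all hypothetical illustrations of the cost/performance/security idea, not VCF's actual placement logic:

```python
# Toy workload placement: pick a landing zone from cost, latency and
# data-sensitivity criteria. All thresholds and zone names are hypothetical.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    cost_sensitivity: float     # 0..1: higher favors the cheapest zone
    latency_sensitivity: float  # 0..1: higher favors staying near users/devices
    data_sensitivity: float     # 0..1: higher means PHI/regulated data

def landing_zone(w: Workload) -> str:
    if w.data_sensitivity > 0.7:
        return "private cloud"   # keep regulated data under direct control
    if w.latency_sensitivity > 0.7:
        return "edge location"
    if w.cost_sensitivity > 0.5:
        return "public cloud"
    return "private cloud"

for w in (Workload("EHR database", 0.2, 0.5, 0.9),
          Workload("imaging AI inference", 0.3, 0.9, 0.6),
          Workload("staff training portal", 0.8, 0.2, 0.1)):
    print(f"{w.name} -> {landing_zone(w)}")
```

The ordering of the rules is itself a policy choice: data sensitivity is checked first so that no cost or latency consideration can pull regulated data out of the private cloud.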

EXPLAINER: Executive orders can be swift but fleeting

WASHINGTON (AP) — President Joe Biden arrived at the White House ready to wield his pen to dismantle Donald Trump’s legacy and begin pushing his own priorities. The new president has already signed dozens of executive orders targeting foundational policies of the last administration. Biden’s goals include reversing Trump’s ban on travelers from several predominantly Muslim countries, calling on the U.S. to rejoin the Paris climate accord and stopping construction of Trump’s border wall. Both Trump and former President Barack Obama relied on executive orders and other presidential directives to get some of their most controversial policies around a deadlocked Congress. But the governing tool often comes with fleeting impact.