I use Keynote to make my presentations, and one time I wanted to build a presentation with someone else. I asked my friend who has worked at Apple for 20 years, "How do you guys build Keynote presentations together? There doesn't seem to be an easy way to do that."
He said, "We don't collaborate at Apple because of the (perceived) risk of leaks. None of our tools are built for collaboration". Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.
So it doesn't surprise me that their video editing tools are designed for a single user at a time.
Edit: This happened about six years ago; they have since added some collaboration tools. However, the point is more about the attitude at Apple in general and why their own tools lag on collaboration.
Edit 2: After the replies I thought I was going crazy. I actually checked my message history and found the discussion. I knew this happened pre-COVID, but it was actually in 2013, 12 years ago. I didn't think it was that long ago.
I've been working at Apple for almost 12 years. While secrecy is indeed paramount, once a tool is internally blessed, we collaborate normally using it. Keynote collaboration is actually pretty standard nowadays.
Opinions are my own and do not reflect those of my employer.
Didn't the article say some floors require keys for different offices and sometimes filing cabinets?
That implies every floor is different which matches what you are saying.
Most of the stories that have come out felt like the image Apple wanted to project. It started with Apple going after the missing iPhone that was left at a bar. We've heard that those working on the latest design for the next iPhone were sequestered away from the rest of the company. I've always thought it was marketing spin, and I'm glad we have an ex-Apple employee confirming this. Back in the 'Lisa' days Apple did split and silo divisions, and Apple did closely guard new iPhone designs with very few leaks happening, but the rest of the mythology is more marketing.
I think you are drawing the wrong conclusions here. I have never worked at Apple, but know many people that have. Every team is different but the overarching theme is that they are very secretive internally, especially around hardware. They are so secretive that someone I know was working on a project that their own manager wasn’t allowed to know about.
Was it like a temporary assignment to another team? Did the manager at least know what team that was? Or have any idea when the employee was going to return full-time to the tasks of their primary team?
Apple uses a functional organizational structure. Every product needs cooperation across all functions to produce results. So an engineer on the OS team working on drivers could be writing the driver for a new hardware part, while other team members, including their manager, are not necessarily disclosed on that hardware.
> Not to mention that blanket statements about Apple are absurd
It isn't absurd, as what GP mentions was imported into Amazon by Dave Limp, a former Apple C-suite executive. It was a terrible culture shock for most of the ICs on my team when we were reorg'd to report in to Limp after Steve Kessel (of Kindle fame), the previous leader, went on sabbatical.
Anything Apple gets attention. But any large organization does various forms of segmentation. Many of these stories are “true”, but also bullshit.
I worked for a company that did some work for the federal government. Boring stuff. Their compliance rules essentially required that we firewall the folks with operational access to their data from the rest of the company. We included the physical offices in that to avoid certain expenses and controls companywide.
Something you said, even out of context, could be interpreted as coming from the company. If you choose to disclose that you work for a company, you become a spokesperson for that company unless you disclaim those words (and even then, there are other considerations to make regardless of whose opinion is being expressed, because you've linked yourself to the company).
By putting that disclaimer there, they decrease the likelihood of repercussions in the workplace for things said outside of it.
You can still get in hot water for anything you say that ties back to you or the company, regardless of whether you disclose who your employer is.
This is the grey-area that corporations typically carve out in a social-media policy so that employees can engage in discussions around their employer without being on behalf of their employer.
It's still a perilous position to put yourself in as an employee. Innocent and innocuous things can always be misunderstood or misinterpreted.
What happens when you use that disclaimer and are self-employed though?
That's a weird answer: Keynote can share presentations, and multiple people can work on the same presentation in real time, either in the macOS/iOS apps or the web version.
The feature has been available for years: https://support.apple.com/en-us/guide/keynote/tan4e89e275c/m...
The collaboration features were introduced in 2013 on the web version, and in 2016 on the native versions. And maybe check which features are actually not available before dismissing it.
The documentation always refers to the current versions of the software, and the latest version of iWork always requires being on latest or near-latest OS. Collaboration also requires all clients to be on the latest version of the software.
Well the issue is that stories from “friends of friends” tend to get super unreliable very fast. Unless someone is coming on the record as having been employed themselves, stories from friends are almost always a lot of BS.
They were BSing you or working in a different part of the company than SWE.
Back in the day Keynote files would just be passed around via a shared server so you and the people you were collaborating with could make and merge changes between them. E.g. I'd do one part of a presentation, Rick would do another part, and we'd copy our slides out of and paste them into each other's decks to get a complete version for rehearsing with. If we had notes for each other, we'd hand over the notes rather than just directly change each other's slides.
There’s a lot of mythology that people just make up about how secrecy works at Apple. It’s mostly sensible.
>Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.
“Severance” is exactly how Apple’s New Product Security and Public Relations organizations would like all employees to be, to an absolute T. However, the rest of the company is much more pragmatic and understands well the value of collaboration and employees having enriched lives that they share with the workplace, since that leads to greater innovation and works well as a recruiting tool as well.
> We don't collaborate at Apple because of the (perceived) risk of leaks.
That sentence, by itself, is more or less correct (from my 26 years at Apple). However, it suggests/implies things that are not correct.
1) In case you got the impression: Apple certainly does not design software to be non-collaborative simply because it would enable sharing/leaking when used within Apple. I would say that Apple has been focused since Day 1 on a mindset where one-computer equals one-user. The mindset was that way really until Jobs was fired, discovered UNIX, and then returned with Log In and Permissions. To this day though I think collaboration is often an afterthought.
So too do they seem to be focused on the singular creative. I suspect Google's push into Web-based (and collaborative) productivity apps (Google Docs, etc.) forced Apple's hand in that department — forced Apple to push collaborative features in their productivity suite.
2) Of course Apple collaborates internally. But to be sure it is based on need-to-know. No one on the hardware team is going to give an open preso in an Apple lunchroom on their hardware roadmap. But you can bet there are private meetings with leads from the Kernel Team on that very roadmap.
That openness, where engineers from different teams could just hang out in the cafeteria and chat about what they were working on, went away when Jobs came back. It probably goes without saying that the secrecy was rigorously enforced when the iPhone was a twinkle in Apple's eye.
The internal secrecy was sold to employees as preserving the "surprise and delight" when a product is finally unveiled but at the same time, as Apple moved to the top of the S&P500, there were a lot of outsiders that very definitely wanted to know Apple's plans.
3) Lastly, yes, plenty of floors and wings of buildings are accessible only with those with the correct badge permissions. I could not, for example, as an engineer badge in to the Design floor.
Individual cabinets needing badge access? I have no idea about that. I am aware of employees hanging black curtains in their office windows when secret hardware would come out of their (key-locked) drawers. (On a floor that is locked down to only those disclosed, obviously the black curtains become unnecessary.)
This matches my experience. In addition I was advised/strongly encouraged to "go dark" on social media and refrain from ever discussing work at lunch, even with teammates.
My badge only worked where I had explicitly been given access, and desks were to be kept clear and all prototypes or hardware had to be locked in drawers and/or covered with black cloths. Almost every door was a blind door with a second door inside, so that if the outer one opened, it was not possible to see into the inner space.
Both are designed to replicate the same functionality as Concurrence and Quantrix (itself a clone of Lotus Improv) both by Lighthouse Design, who made lots of apps for NeXTSTEP and were purchased by Sun.
Steve Jobs used Concurrence on a ThinkPad and also a Toshiba laptop to make presentations prior to Keynote (which I believe was created internally for him at first) even while back at Apple.
>So it doesn't surprise me that their video editing tools are designed for a single user at a time.
The editors of Severance are actually using Avid. For music composition they're using Ableton. Neither are Apple products. The remote desktop product they're using is Jump Desktop.
While the show is an Apple TV+ show, and they happen to use Macs in the process, this has shockingly little to do with Apple tools or products.
I seem to recall an anecdote from a colleague that interviewed with one of Apple's security teams. The actual room where the interview took place was locked from the outside and you had to use a badge reader on the inside to leave. I guess they didn't want folks wandering if someone needed to make a restroom break, but I can't help but wonder about issues like, say, a fire...
One wonders how well that is tested, as well as what happens if a fire goes undetected, or if someone's badge stops working, or if there are technical difficulties with the badge reader or its infrastructure...
There are far too many things that can go wrong with such a setup.
I have personally worked in buildings that had a "badge readers all stopped working for a while" problem. Fortunately, the badge readers only affected ingress and not egress, and only controlled exterior doors and labs; that's easily solved with a doorstop and a person checking badges. I can very easily imagine what could have happened in those buildings if a badge was required to leave a conference room.
And if you want to make that scenario terrifying, imagine being there on a weekend or holiday.
> I can very easily imagine what could have happened in those buildings if a badge was required to leave a conference room.
The facilities team and fire marshal are also easily capable of imagining this, already have, and you can ask them about it.
In this case the doors would fail open, or are made of glass and can be broken down. It's not a /really/ secure location. It's just a tech company that likes to seem secure during work hours. After hours of course the janitors get to see everything.
You are utterly missing the point, to the point that you are analyzing this conversation through entirely the incorrect lens, in an effort to belittle.
In an effort to steelman your comment, you may have incorrectly interpreted the earlier "I wonder how well that is tested" as "this is unsafe and illegal" rather than "among the many things wrong with this, this has increased the number of things that can go wrong, and is less safe on an absolute scale, whether or not it's strictly legal and up to code", and then assumed everything else in subsequent comments was about fire safety, rather than being a series of points in support of locking people into a building is a bad idea.
You are asserting the competence of the fire marshal, as an argument in a conversation about locking employees and interviewees and visitors inside a company's office rooms.
What you may think was happening here: "heh, nerds think they're smarter than the fire marshal and nobody involved thought of this until they came along; of course there'd be a way for sufficiently capable humans to get out of a room if something went wrong, and of course this will have been made to pass fire code, which is the only thing being talked about here".
What was actually happening here: While with sufficient analysis (which has most likely been done) it is possible to provide a sufficient degree of fire safety to make it not against fire code to lock people into a building, that doesn't make it right or zero-cost or risk-free, nor does it alleviate the stress and potential problematic-but-non-fatal situations that could arise. At no point was the primary purpose of the comment "people might burn in a fire", even though the risk of that is not zero at any time and has likely been raised (within presumably-acceptable-to-fire-code levels) by such a setup.
When I said "I can very easily imagine what could have happened", I was not imagining a fire burning down the building with the people inside. I was imagining how few failures it would take to end up with people trapped in a room long enough to reach the level of stress required to physically break out of it, compounded by having worked in labs where the air conditioning was sometimes woefully insufficient.
It takes a lot of stress to get normal people to the point that they're willing to break windows or doors or walls in order to escape a room, and nobody should be subjected to such things, because there's zero security justification for a company locking people inside at any time.
Grenfell Tower was fireproof as originally designed. The problem was renovations that compromised the original design, by adding highly flammable cladding panels to the exterior that allowed the fire to spread easily around the entire building.
More importantly, the HQ is built in California, which despite appearances isn't a capitalist dystopia but a local government dystopia.
Any random local government staffer is the most powerful person in the universe and obeying them is a religious edict. Apple has zero power to disobey anything in the fire code and they're probably not even capable of imagining doing so. That's why the random suburb they're in has the best public schools in the country and all the houses are like $5 million.
As an example there's currently a big empty lot next to said HQ where the mall used to be, because a random woman on the city council has blocked apartment construction for the last decade, because she thinks Apple employees will move in and molest local high school students.
Power failure is a best case. I've observed firsthand cases of "badge access system went down, none of the doors open". That's less of a problem for external doors that allow people out but not in, because it can be solved by propping the door and posting a guard who checks badges. It's a massive problem when conference rooms and offices lock people in.
there is also the earthquake issue where interior doors (badge access or no) can become jammed. thus god invented the crowbar. my Big Company emergency response team folks all had one. also good for head crabs.
A few times in my life I really had to get through a locked door and asked myself "What would Kojak do?" and always got through with at most three kicks.
>but it was actually in 2013, 12 years ago. I didn't think it was that long ago.
I know some people will say this is because of age. But I want to suggest that I often think of the COVID years, 2019 to 2023, as a single year / event, for reasons I can't quite fathom. So when I think of 2015, I count back something like 2023-2019 (one step), 2018, 2017, 2016. So it feels like around 4-5 years ago.
Obviously a huge bias here (I work for Figma), but it’s one of my favorite things about Figma Slides. The product still has a ways to go, but man being able to actually be collaborative and not feel like you’re fighting against the software is a game changer.
Video is a harder game due to the processing and data requirements, but I know that there are a lot of startups trying to make it collaborative first. I’m really excited for that to be the default.
A quick note on this for non-editing folks: in context, a "proxy" here is a low-res version of your actual footage. It's common to use them while editing a cut together, and then to replace them with the full-res versions at the very end.
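To put rough numbers on why proxies matter (the bitrates below are illustrative assumptions, not figures from the article), here is a quick sketch of the storage gap between camera originals and proxies:

```python
# Rough, illustrative numbers only -- real codec bitrates vary widely.
def hours_to_tb(bitrate_mbps, hours):
    """Storage in terabytes for footage at a given average bitrate."""
    bits = bitrate_mbps * 1e6 * hours * 3600
    return bits / 8 / 1e12  # bits -> bytes -> TB

camera_4k = hours_to_tb(bitrate_mbps=1000, hours=100)  # e.g. 4K intraframe/raw
proxy_1080 = hours_to_tb(bitrate_mbps=30, hours=100)   # e.g. 1080p proxy codec

print(f"100h of 4K originals:  ~{camera_4k:.0f} TB")   # ~45 TB
print(f"100h of 1080p proxies: ~{proxy_1080:.2f} TB")  # ~1.35 TB
```

The exact ratio depends on the codecs involved, but a gap of one to two orders of magnitude is what makes proxy workflows attractive for remote editors.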
I worked at Apple over a decade ago and no idea what OP is talking about.
There is plenty of collaboration in the company but it's typically constrained to the current project you're working on. And working in enterprise companies today it is no different.
Apple is famously a company that encourages cross-functional collaboration, as anyone who's ever interviewed there, or who knows more than your friend does, could attest. They're secretive, yes, but also collaborative.
You can even read any accounts of famous shipped products to back up that cross functional collaboration has been their culture for many decades. Jobs mentioned it many times, and many articles have been written about it.
Additionally, Keynote (and the entire iWork suite) has had collaborative editing for years now.
I suspect your friend is likely misinformed or not reliable?
I wouldn't be surprised if their attitude toward remote collaboration changed pretty significantly around 5 years ago. But fair enough that it may not yet be a primary consideration in all of their software.
Except "uncle that works at Nintendo" is a meme because Nintendo of America is a small operation that develops almost nothing, whereas "friend that works at Apple" is a much more plausible claim on an American tech-worker forum.
Isn't it also true that Apple has dozens of different SCM / developer platforms scattered around the company? E.g. some teams use GitLab, others Phabricator, etc.
I think so as I just saw this on their jobs website:
> We are seeking an experienced Software Architect specialized in source control systems to join our dynamic team. The ideal candidate will have expertise in designing, implementing, and managing systems like GitHub, GitLab, Perforce, Bitbucket, and Artifactory.
Huh. At my last company, there was probably less presentation collaboration (in my case, less, though still some when I was co-presenting), but shared documents with editors and so forth were huge. Better built-in workflows would have been nice, but it worked well enough with a bit of discipline, e.g. once you did a handoff you (mostly) didn't make further changes unless you were noting a typo or something.
It was quite common to have remote desktop cards on high-end machines so that you could hide them away somewhere quiet. The edit stations / Flame / Baselight machines all had a fucktonne of 15k SAS drives in them, so they were really noisy.
You couldn't invite a director to see what you were doing, when all you can hear is disk/fan whine.
They were quite expensive because they needed to encode and send 2K video at decent bit depth (i.e. not 4:2:0 but 4:4:4) and low latency. Worse still, they needed to be calibratable so that you could make sure the colour you saw was the colour on the other end.
Alas, I can't remember what they are called, thankfully, because they are twats to manage.
This is a pretty common problem with all true workstation level computer systems. It's like taking a rack from a data center and putting it in your office. You've got a dozen or more spindles and fans spinning. I've seen systems with $200,000 worth of RAM in them, but that was back when 256 GB of RAM was $100k. And, yeah, they had 15k SAS drives. If you think servers are expensive, you've not priced workstations.
Every time I've seen higher end workstations, the actual workstation itself was always in a separate room, and there's been some kind of remote KVM solution used. The workstation was always very noisy and generated a lot of heat. It's also just... a lot of money to shove under a desk where people kick it all afternoon.
I do I.T. for a small broadcast studio (it's actually a sports venue, but they have their own production and broadcast studio), and it is indeed still very much like this. We have rack-mounted workstations alongside all the servers and networking, with KVMs to the next room where the production is handled. This was all spec'd and built out in 2019.
Not exactly sure how that compares, but I bought one of these quite hopefully: https://www.apc.com/us/en/product-range/203414049-netshelter... and "soundproof" means my home office doesn't sound like sitting on a subway train, and more like the inside of an airport.
Indeed. But the equipment inside has fans and must move air around. The enclosure is most effective at high frequency noise and so what's left is much more tolerable (and reduced). I suppose you'd need to go to liquid cooled to do better? Or are you aware of a more effective enclosure?
I'm afraid of trying something really invasive like liquid because I live in Portugal and getting weird stuff takes forever.
> Or are you aware of a more effective enclosure?
No I don't really: I have some heavy blankets hanging which helps a bit. My musician friend told me I should glue some egg cartons to the blankets, so I'm going to try that soon.
Probably Miranda. Brings back a lot of memories from the Flint/Flame/Inferno days. I remember buying a Tezro for ~150k USD in 2005/6. We also were "gifted" an Inferno around that time, which I heard originally cost multiple hundreds of thousands. When it showed up it was the size of a refrigerator and took dual 30A power feeds. Sounded like a jet and didn't last long.
Teradici came on the scene and started running everything over IP. Hardware at first (old EVGA pyramids were everywhere), where you had to route the video out into a custom card that then put out the signal via IP.
Now it's all software, with the leaders being Teradici (merged into HP Anyware, which came from IBM), NICE DCV (Amazon), Parsec, and a few others.
The big advantage in content production over something like VNC/RDP was colour fidelity, local cursor termination, and support for hardware like Wacom tablets. You can even do 7.1 audio and multiple monitors. It turns out that when you're an artist, having a local-like feel is incredibly important. 60 fps is 16 ms per frame, so even with virtual workstations on AWS you want to deploy them in a region that is relatively close to the end user.
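The 16 ms figure is just 1000/60; a quick sanity check of per-frame budgets at common rates (my own arithmetic, not from the comment):

```python
# Time budget per frame at common frame rates.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps:>4} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# Capture, encode, network round trip, decode, and display all have to
# fit inside that budget for remote desktop to feel local, which is why
# you deploy the workstation in a region close to the artist.
```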
So there are a couple of options, depending on the hardware. If it kicked out HD-SDI you could just patch the display into the coax in the building and have done with it.
But that only worked if you were in the same building and your machine kicked out HD-SDI.
Most machines either shat out dual-link DVI or, worse, some custom shit. Getting a cable that could reliably transport dual-link DVI more than 10 meters was difficult and expensive. Worse still, it had a habit of dropping back to single link, or some other failure mode that was ever so annoying to debug. Moreover, 10 meters often isn't far enough, especially if the room had a projector (so it might be a >5m throw).
Now, that's the simple case. The hard case is multi-building. Say you have an operator working in London and the director in New York, and you want to give them the highest quality picture possible. The only way to do that at the time was with one of these cards, or some nasty SDI hardware h.264 transcoder (hugely expensive at the time).
I really wish I could remember what they were called. They appear to have fallen out of favour.
Now you'd just use cineSync, as your laptop can encode video in real time (https://www.backlight.co/product/cinesync). Also, rumour has it that the Wolverine movie was leaked because a producer got coked up and left an unencrypted laptop on a plane, rather than using cineSync to show an edit to someone important. Alas, I can't verify that.
I'd love to hear more stories about coked-up producers in the film industry from 15+ years ago. Having done technical codec work adjacent to some of it in the past, it's a wild business to be in.
Based on the comment about 15k spinning drives this must have been quite some time ago, but there are very definite reach limits on DVI and DisplayPort cables. Let's say this was in 2007 and the state of the art was a dual-link DVI 2560x1600 display: you can't extend that in any practical way beyond about 15 feet. Extending a USB keyboard and mouse, by comparison, is trivial. Unless all of the desks and workstations were set up directly on one side of an acoustic barrier wall, it was a hard problem to solve.
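A back-of-the-envelope check (assuming 8-bit RGB and a rough blanking overhead, both my assumptions) shows why that display needed dual-link DVI in the first place:

```python
# Approximate pixel clock and raw data rate for 2560x1600 @ 60 Hz.
h, v, fps = 2560, 1600, 60
blanking = 1.1                            # assumed ~10% blanking overhead
pixel_clock_mhz = h * v * fps * blanking / 1e6
gbps = pixel_clock_mhz * 1e6 * 24 / 1e9   # 24 bits/pixel (8-bit RGB)
print(f"~{pixel_clock_mhz:.0f} MHz pixel clock, ~{gbps:.1f} Gbit/s")
# Single-link DVI is capped at a 165 MHz pixel clock (~3.96 Gbit/s of
# pixel data), so this mode needs both links: six TMDS pairs that all
# have to stay in spec, which is why long runs were so fragile.
```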
> you can't extend that in any practical way beyond about 15 feet
For passive cables, that makes sense. But with repeaters, wouldn't you be able to go further? Maybe cable repeaters like that are newer than I imagine.
I bought an expensive 10m (30ft) active HDMI cable for connecting my PC to my TV. It said it was UltraHD rated, but could never get it to work reliably beyond 1080p.
So the issue was more that dual-link DVI was very rare, and getting hardware to encode/transmit it reliably and at high bit depth was almost impossible.
By about 2014 hardware encoders were good enough to send decent quality video over gigabit.
> I bought an expensive 10m (30ft) active HDMI cable
I think what I was referring to are repeaters you put between cables, that amplifies the signal. You connect that device between two HDMI cables (say 5m) + connect it to power for it to actually extend the distance the signal can travel.
I'm not sure what an active HDMI cable would be, maybe circuitry inside the cable that draws power from the HDMI port?
Sounds like hocus pocus to me (like gold plated connectors), maybe you get like half a meter of extended distance or something with those? If I had to I'd go the repeater way (or as others mentioned: fiber, but sounds expensive and not maintenance free)
There are a slew of HDMI extension systems, some that even use ethernet with hardware encoding/decoding. Grandparent commenter hasn't worked in the industry in at least a decade if they're talking about DVI.
These days, if you're just wiring to a single workstation in a nearby next room, 50 meter active optical Thunderbolt 3/4 cables can carry 5K+ DisplayPort video passthrough and data from your USB peripherals.
(It's "passthrough" and not "uncompressed" because DisplayPort may use DSC depending on the resolution and frame rate.)
US$500 for an optical cable can be a lot cheaper than paying for HDMI extender sender and receiver boxes.
In a modern video editing system it's still a non-trivial challenge, because you can't just go using any COTS HDMI extension system, which might be good for 2160p30 at 4:2:0, or maybe 2160p60 at 4:2:0, but may NOT be capable of 2160p60 at 4:2:2 or 4:4:4. Or may not function for DCI resolution at 4096x2160. Or anything 8K.
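The gap between those modes is clear from the uncompressed data rates (ignoring blanking; my arithmetic, not the commenter's figures):

```python
# Uncompressed data rate by chroma subsampling at a given bit depth.
def data_rate_gbps(w, h, fps, bits, subsampling):
    # Samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return w * h * fps * bits * samples / 1e9

for sub in ("4:2:0", "4:2:2", "4:4:4"):
    rate = data_rate_gbps(3840, 2160, 60, 10, sub)
    print(f"2160p60 10-bit {sub}: ~{rate:.1f} Gbit/s")
# 4:4:4 carries twice the payload of 4:2:0 at the same resolution and
# frame rate, so an extender sized for 4:2:0 simply cannot pass it.
```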
There's plenty of HDMI2.0 compliant "video over ethernet cable or fiber things" which are the ordinary COTS products that may not be sufficient for serious video editing needs.
People on video editing workstations these days are using higher end monitors that can be trusted to work in 10bit color and to match a certain color space grading.
On the other hand it's a lot easier these days to have a relatively quiet video editing workstation that has 8 to 16TB of local, pci-express bus attached NVME storage for work space, and that same workstation can have a not-very-expensive 100GbE NIC in it attached to some large/noisy storage elsewhere.
> some that even use ethernet with hardware encoding/decoding
We had those; the problem is that they lose bit depth. They were also fucking unreliable. We had a lot of HDMI extenders and they worked for 1920x1080, and sometimes 2K if you were lucky.
We used them for the "prosumer" LCD projectors we had in the review rooms. They didn't work so well for the massive Christie projectors. (I seem to recall those abused 3G-SDI to get the resolution.)
Guys c'mon... The desk is set dressed. Nothing in the photos makes any sense. Last of all, Geoffrey Richman isn't doing editing work in Ben Stiller's apartment.
> Geoffrey Richman reviews season two finale footage. In his at-home edit bay (not pictured), he works on iMac, which remotes into a separate Mac mini that runs Avid from a post-production facility in Manhattan’s West Village.
One director had their sofa shipped into the digital intermediate room (it had a 2K calibrated digital projector) for 4 months. An artist has got to be comfortable....
For VFX, Disney/Marvel/Fox/Sony required that the entire network be air-gapped, with really stringent rules on USB, data tracking, and interchange. All internet access had to be done via RDP with copy/paste blocked.
Had Sony bothered to follow its own rules, it wouldn't have been hacked and had all its data leaked in 2014.....
But to answer the question, we had a shit tonne of networking, so as far as I'm aware it was just on the vanilla network. Might have been a separate VLAN though.
I studied digital media at uni, but I had been using Linux since about 1999.
I wanted to be a compositor, but failed the rotoscoping test at the company I was working at. So I fell back on my technical skills and became an infra engineer. I left VFX in about 2015, and sadly, no matter how much I want to go back, I don't see much of a future in it. GenAI is really going to do a number on it.
> I left VFX in about 2015, and sadly no matter how much I want to go back, I don't see much of a future in it. GenAI is really going to do a number on it.
I don't think generative AI will make entire industries disappear, but rather let people within those industries do more with less. Seeing as you somewhat see what the future of the industry is, and assuming you're right, it puts you in a good position to gain the skills you think will be sought after. You seemingly have the technical skills too. Just an idea; I'm not working in either area, so take it with a grain of salt, I suppose.
Interesting, but this misses perhaps the most embarrassing part: They're using Avid and not FCP.
I also don't buy the author's rationale for remote editing; it's oddly archaic: "high-end video production is quite storage-intensive, which is why your favorite YouTuber constantly talks about their editing rigs and network-attached storage. By putting this stuff offsite, they can put all this data on a real server."
Storage is cheap now, and desktop computers are more than powerful enough for any video editing. Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet. The primary benefit of remote editing (and the much-hyped "camera to cloud") is fast turnaround, which you need for stuff like reality TV and news. But a dramatic series like Severance?
It is pretty baffling that Apple would create a PR vehicle that impugns its products like this. It would be better to say nothing. After Apple acquired Shake, they splashed Lord of the Rings, King Kong, and other major tentpoles on the Apple homepage at every opportunity... of course not mentioning that Weta was rendering those movies on hundreds of Linux servers instead of Macs. But at least Shake was the same product across all platforms, and it really was the primary effects tool on all those movies.
"(they do not mention the use of Jump Desktop, which seems like a missed opportunity to promote a small-scale Mac developer. C'mon Apple, do better.)"
Oh boy, this is just a minor infraction in Apple's history of disrespect toward developers. They do this, and worse, to major development partners too. I'm not going to name names, but after one such partner funded the acquisition of material on its own equipment and that material was used in a major product keynote... Apple not only neglected to credit or even mention that partner, but proceeded to show the name of a totally uninvolved competitor in its first slide afterward. The level of betrayal there was shocking.
The storage requirements are still massive. I would guess the raw footage for something like Severance (and they probably shoot in at least 4K) is going to be in the area of a petabyte for the entire season.
Even today it's not close to practical to have an entire episode's worth of raw footage (of which there'll be many many takes, many many angles) entirely on an editor's workstation.
The surprising aspect is that they don't use proxies for editing rather than remote desktop.
83 terabytes of raw footage for one episode (the S2 finale). This was the longest episode (which doesn't necessarily correspond with the amount of footage shot). But for a 10-episode 4K HDR series, 1 PB is in the ballpark for a season.
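A quick sanity check of that ballpark, scaling the cited 83 TB figure across an assumed 10-episode season (illustrative arithmetic only):

```python
# Scale the cited per-episode raw-footage figure to a season.
# 83 TB is the number cited for the S2 finale; the 10-episode
# season length is an assumption for illustration.
per_episode_tb = 83
episodes = 10
season_pb = per_episode_tb * episodes / 1000  # TB -> PB
print(f"~{season_pb:.2f} PB per season")      # ~0.83 PB, i.e. "1 PB is in the ballpark"
```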
And yet they still can't stream me a 4K HDR file without noticeable banding all over those white hallways. It's funny how this is the highest-quality stuff, yet it shows the lowest-quality defects at the same time due to fundamental limitations of digital systems.
It is not a "fundamental limitation of digital systems". It is a limitation of streaming services.
If you had more throughput, more bit depth, etc., you would have enough colors to avoid visible banding. But the bitrates required (if you insist on 4K) are tough on SSD/HDD I/O, to say nothing of your network connection. And even if you have the best connection ever, most people don't, and streaming services want the bitrate as low as possible, as long as the average viewer is not too upset, because delivering higher bitrates costs the service real money, and most customers don't have the connection for it and don't care about banding.
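The arithmetic behind the banding is simple: near-white walls only span a small slice of the luminance range, so few code values are available for the gradient. The 5% gradient fraction below is an illustrative assumption, not a measurement of any stream:

```python
# How many quantization steps a subtle gradient gets at a given bit
# depth. Fewer steps spread across a wide wall = visible bands.
def visible_bands(bit_depth, gradient_fraction=0.05):
    # gradient_fraction: share of the full luminance range that the
    # on-screen gradient covers (an unevenly lit white wall is tiny)
    levels = 2 ** bit_depth
    return int(levels * gradient_fraction)

print(visible_bands(8))   # 12 steps -> banding
print(visible_bands(10))  # 51 steps -> much smoother
```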
> The surprising aspect is that they don't use proxies for editing rather than remote desktop.
In my experience it is way easier to scale storage bandwidth than compute, at least locally.
There have been times when I've been able to cut a shoot from the raw files before proxies were available, and this has been corroborated by other editors.
So it took less time to cut and submit for review than to actually generate the proxy media.
Sure, if your workflow has a decent gap between shooting and post, then generating proxies is trivial, but sometimes a little more storage and memory bandwidth goes a very long way.
Yes, attaching many terabytes of video is cheap now.
But scrubbing through that high-res raw video isn't (just) size-intensive; it's throughput-intensive. Size is to throughput as energy density is to power density. You can get a pretty good all-SSD NAS, but going through a 40 Gbps (5 GB/s, minus overhead) Thunderbolt 4 link is still going to be OK but not stellar. A single desktop SSD can triple that!
I can fully see the desire to remote-stream. Being able to encode AV1 on the fly to your local editing station, or even H.265, at reduced quality, while still having the full bit depth available for editing, sounds divine.
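The throughput gap being described can be put in rough numbers. These are nominal link rates and an assumed high-end desktop NVMe figure, not benchmarks:

```python
# Nominal bandwidth comparison: NAS-over-Thunderbolt vs. local NVMe
# vs. a remote-desktop stream. All figures are approximate.
tb4_gbps = 40                   # Thunderbolt 4 link rate, Gbit/s
tb4_gb_s = tb4_gbps / 8         # ~5 GB/s before protocol overhead
nvme_gb_s = 15                  # assumed high-end desktop NVMe, GB/s
stream_gb_s = 50 / 8 / 1000     # an assumed 50 Mbit/s H.265 desktop stream
print(f"TB4 link:      ~{tb4_gb_s:.0f} GB/s")
print(f"Desktop NVMe:  ~{nvme_gb_s} GB/s ({nvme_gb_s / tb4_gb_s:.0f}x the TB4 link)")
print(f"Remote stream: {stream_gb_s:.5f} GB/s")
```

The point of the last line: the pixels you look at over a remote desktop cost three orders of magnitude less bandwidth than the media they represent.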
You're saying Thunderbolt 4 is going to struggle with something, and then touting a desktop SSD as "tripling" TB 4 throughput... but finally declaring that "remote streaming" is somehow better than both of those?
> I also don't buy the author's rationale for remote editing; it's oddly archaic
> Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet
Remote streaming is far better. A 2 Mbps or 20 Mbps connection to a powerful editing station is awesome. A compressed-down H.265 stream with HDR will still let you edit very well, while the remote machine handles intensive editing tasks with ease.
This really isn't hard at all; the advantages and wins are amazing, and remote desktops have been amazing for decades now. I struggle to see how you continue to justify being so far up this creek.
Again, you are contradicting yourself and haven't been able to cite all of these "amazing wins." You're claiming that you're going to struggle with scrubbing over TB 4... and pushing remote editing instead! That's laughable.
Also I don't think you understand compression. Interframe-compressed codecs like H.265 are a bigger computational pain in the ass than ProRes (for example).
And "remote desktops have been amazing for decades..." What? Irrelevant. In the '90s people were still buying heavily optimized turnkey systems with SCSI arrays just to be able to capture and edit SD video at broadcast quality; and you couldn't even stream VHS (6-hour mode) quality over the Internet. Come on, man. Why shill so hard for your pet workflow, and berate other people who don't want or need it?
I can scrub my 4K video just fine over Thunderbolt 2. Maybe you need to defrag, bro!
I really struggle to understand where you are coming from, or to see what reef you've so clearly beached yourself on, to so resolutely not get it.
> scrubbing over TB 4... and pushing remote editing instead! That's laughable.
You seem incapable of grasping the basic premise of what desktop streaming is. A modern video card gives you a pretty good quality 10-bit 4:2:2 (or 4:4:4 or 4:2:0) hardware-accelerated H.265/HEVC and AV1 encoder that is just sitting there for use and consumes no other resources; for all intents and purposes, free.
You connect to your render workstation's desktop and scrub there, on its many-GB/s SSD array.
Even better, instead of buying everyone on the team their own high-end desktop or beastly laptop and their own SSD array, anyone can connect to a virtual desktop as they need it. There are actually three different hardware encoders even on regular consumer GPUs! A 64-core AMD 7R13 Milan is $1000 and will let you load up absurd numbers of GPUs and SSDs; that'll host a whole team very effectively.
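The per-seat economics of that shared-host approach can be sketched. Apart from the $1000 CPU figure above, every price here is an assumed placeholder, not a quote:

```python
# Shared VDI host vs. one high-end workstation per editor.
editors = 8                     # assumed team size
dedicated_per_seat = 6000       # assumed desktop + SSD array per editor
host_cpu = 1000                 # 64-core AMD 7R13 (figure cited above)
host_gpus_and_ssd = 15000       # assumed GPUs + shared NVMe pool
vdi_per_seat = (host_cpu + host_gpus_and_ssd) / editors
print(dedicated_per_seat, vdi_per_seat)  # 6000 2000.0
```

Under these placeholder numbers the shared host wins per seat, and the gap widens when not everyone edits at once.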
I'm really confused why the internet scares you so, and how you've missed the premise of this article entirely. Maybe you should try booting Sunshine and Moonlight some day, as an easy-to-DIY, low-latency, low-bandwidth VDI.
re: Apple not using Final Cut Pro (FCP). I feel like Apple made an intentional decision to abandon the high-end production market when they released FCP 10 in 2011. They dropped multicam, XML import/export, etc. I heard they eventually brought most of these features back, but it seems clear Apple isn't focusing on this part of the market.
FCP 7, which traces back to the product Apple bought from Macromedia, was garbage. It was never "high end."
The new FCP could have righted many wrongs, but Apple turned its development over to people who didn't even understand industry-standard terms... and who rejected input from experts Apple had hired years earlier. But that's Apple's standard behavior. They just don't learn.
> It is pretty baffling that Apple would create a PR vehicle that impugns its products like this.
I'm struggling to see any of this, frankly. Of course Apple uses non-Apple software. It'd be pretty weird if they didn't.
All this marketing bullshit reinforces the value of refusing to engage with marketing. What a massive waste of time and effort for all societies and cultures involved.
Avid does have a cloud based solution. This isn't that.
It's a clever way to have your media centralized and yet have access to editors all over the world.
And a modern AVID system does not struggle with a few editors accessing the same footage.
First of all, it's usually a proxy format, and secondly, the storage can deliver a combined 800MB pr box sustained for x number of editors at the same time.
Nothing these days "struggle(s) with a few editors accessing the same footage".
AVID hasn't been at the forefront of video editing since the Avid/1 / ABVB days. They sell a reasonably usable program with horrible hardware (since Meridien hardware - it's good they finally let us use other hardware such as BlackMagic), but never truly fix large problems. People therefore stay on a specific version of the software for ages, because everyone is scared of new and different bugs.
AVID's shared media offerings are tenfold the cost of other storage options simply because they have a flag on the mounted volumes that tells Media Composer to allow project and media sharing. "800MB pr box sustained" means nothing because anyone can do that easily with commodity hardware.
In other words, AVID is milking their cash cow and they really don't innovate or even try to offer a good product.
Apple, on the other hand, destroyed their professional editing products, then replaced them with decent tools, but ones that are worlds different. Many people have mixed feelings about this. On the other hand, if you want to edit 8K ProRes, Final Cut Pro makes it simple on any ARM-based Mac.
First, facts don't rely on the amount of experience the person sharing them has. But I do get that it's easier to take someone at their word when they have lots of experience, so yes, I've worked on all sorts of projects of all sizes.
I think you've been sold a bunch of ideas. For instance, Avid has no dependency on Blackmagic. They use Open IO, which means you can use any card that supports Open IO, whether Aja, Blackmagic, Bluefish, Matrox, whatever.
Nexus / ISIS isn't special. The flag is literally just a flag that tells Media Composer to enable bin and media sharing. It can be enabled on any kind of sharing - NFS, AFS, SMB, et cetera. For example, check out Mimiq software for enabling it wherever you want.
It's just that you get everything wrong; if you had real experience, you would know that for a fact.
AVID depends on Blackmagic; if you knew what you were talking about, you would know that (this is where you Google, I bet).

The NEXIS hardware/software isn't just a flag; time for another visit to Google.
First, I don't use Google, but that's not the point.
Second, please tell me how the fact that you can use no video interface card or Aja means you're dependent on Blackmagic.
Third, please tell me how bin locking on ISIS / Nexis is different than bin locking on third party shared storage with the AVID sharing flag turned on.
You've offered literally no searchable facts. If I search for anecdotes about how ISIS / Nexis are different, I'm only going to get marketing fluff.
So offer something of substance. Claiming someone is wrong without even saying what they're wrong about is not how any of this works.
AVID sells AVID-branded Blackmagic hardware; come again, how are they not dependent? And as you know from your vast fact-based AVID experience, that has been a problem since Apple Silicon.

Bin locking is half the story: the AVID client enables link aggregation to the NEXIS storage, 4 NICs = 4 times the bandwidth. You won't get that with 3rd-party hacks.

The NEXIS storage uses an AVID custom filesystem that does what you claim isn't special: delivers a sustained 800MB even if 20 clients are reading files.

Now please, before you make more fact-based claims: I have used, and still use, the 3rd-party "bin lock" solutions when I have special cases, and I can promise you, an ordinary file server does not compare when many clients are hitting the storage.

Substance delivered; let's see if there is a chance of someone learning something.
All I can say is that you've been sold selling points. I appreciate the attempt, but:
1) How does AVID selling Blackmagic hardware make that a dependency? You can just as easily buy Aja hardware. "This depends on that" means it requires it. AVID systems do not require Blackmagic hardware at all. If you think they do, please explain.
2) Bin locking and media sharing (client specific paths in "OMFI MediaFiles" and "Avid MediaFiles") is a flag that is either off or on. That has nothing to do with all of the other things that have been sold to you as "special" about AVID storage.
For instance, link aggregation has been built in to macOS since the early days of Mac OS X. Also, it really doesn't matter. If you want something that literally does 4 gigabytes a second, you can do that all sorts of ways with current Macs - no need for multiple NICs.
Anyhow, speed is largely irrelevant for editing systems. The only time speed matters is if the storage can't keep up. You're not watching video at 2,400 frames per second as you're scrubbing through video at 100x speed, so people who are concerned with "800 MB" (you're not even saying per second, or anything like that) are no different than the people who want the wanna-be muscle car that puts out 500 horsepower but just drive it to and from the store. Who cares? If you care, you know. If you have the need, you know. If you're working on 4K uncompressed, you're not doing it on shared storage anyway; that's just silly. But if you REALLY need to do 4K uncompressed on shared storage, guess what? You're not using AVID, because it can't support that :)
Otherwise, "800 MB" is just a sales number. I just built a NAS for less than $2500 that does 1.2 gigabytes per second, and I wasn't even trying to make it fast.
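For context on what an aggregate figure like that buys you, divide it by per-stream codec data rates. The rates below are typical approximations, not vendor specs:

```python
# How many simultaneous playback streams a given aggregate
# throughput supports, per codec. Rates in MB/s, approximate.
codec_mb_s = {
    "DNxHD 36 proxy (1080p)": 4.5,
    "ProRes 422 HQ (UHD)": 88,
    "ProRes 4444 (UHD)": 132,
}
aggregate = 800  # MB/s, the figure under discussion
for name, rate in codec_mb_s.items():
    print(f"{name}: ~{aggregate // rate:.0f} streams")
```

With proxy-rate media, hundreds of streams fit; with heavy mezzanine codecs, only a handful. That is why the editing format matters more than the headline number.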
> i have used and still use the 3rd party "bin lock" solutions when I have special cases, and i can promise you, an ordinary file server does not compare when many clients are hitting the storage.
Those are two different things. If you choose to conflate them, that's up to you, but I can easily show shared storage that makes AVID's look outright pokey, particularly with twenty clients, just as I can show you software that turns AVID's bin locking on or off. So you're not fooling anyone by trying to suggest that all bin-locking file servers are somehow inferior, or that they're inferior because they support bin locking, or however else you want others to think they're connected.
They're separate things. You do understand that, right?
I hope you take away from this that there are more products than just the half-dozen most common ones, and that products outside the post world often make products in the post world look ridiculous, in part because the post-world ones are a generation older and multiple times the price. But because people in the post world don't know any better, they more often than not spend literally ten times the going price to get something with an AVID sticker on it, even when you can show them that the AVID product is just a rebadged Seagate storage array or whatever.
1) You don't think they depend on Blackmagic when they have sold that hardware to their customers? What?
2) Let me quote someone, you:
"AVID's shared media offerings are tenfold the cost of other storage options simply because they have a flag on the mounted volumes that tells Media Composer to allow project and media sharing".
So what is what? And I knew you would come back with a long write-up about "link aggregation". You don't understand it; not a surprise!

But at least you used a car reference: AVID is the Ferrari, and the "I CAN BUILD A NAS" is the useless muscle car. And of course they have gear that can handle uncompressed footage; please, they are the standard for winning an Oscar.

I take away from this that you have no idea how this works. It's not called "bin locking servers"; that's just software. And it's not the hardware that makes a NEXIS special, it's the AVIDFS. I am only writing this for others who might read this, so they understand.
I spent some time a while back thinking about a web-native video editing tool with very lightweight client demands. This came up after watching all those LTT videos about their storage & networking misadventures around the editors. It seems something approximating this (or superior to it) has already been developed.
The way you develop and manage the proxies appears to be the biggest part of the battle in making things go fast. There's no reason for editor workstations to be operating on the full-res native material unless there's a targeted reason to do so.
LTT is probably not a good/representative example for anything. They'll do infra stunts for content, then it will fail and they'll get content from the failure and content from the new thing. It's in their interest to be slightly on the bleeding edge and slightly janky while having access to subsidised hardware.
And I mean that in a completely positive "it's awesome" way. Just... not the problems anyone else should be facing.
Before Covid your idea was the one everyone was pursuing, including AVID, with an embarrassing system that I never saw in a satisfying version.
With Covid, remote access became the norm and the online/proxy workflow more or less died. Avid still has a working version (better than the original), but it isn't widely used.
Proxies are used for several reasons: expensive storage, heavy codecs at high bitrates, or multicams.

They are typically avoided whenever you can, because the online part of a proxy-based workflow can be a challenge, and especially with tight deadlines you want all the variables out of the way.
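A minimal sketch of why that online (conform) step is the risky part: every proxy clip in the cut must map back to a camera original, and any miss blocks delivery. The clip names here are hypothetical, and real conform tools match on timecode and metadata too, not just names:

```python
# Find proxy clips in a cut that have no matching camera original.
# Matching by clip name (same stem, different extension) is a
# simplification of what real conform tools do.
def find_unlinkable(proxy_clips, original_clips):
    stems = {name.rsplit(".", 1)[0] for name in original_clips}
    return [c for c in proxy_clips if c.rsplit(".", 1)[0] not in stems]

print(find_unlinkable(["A001_C003.mov", "A002_C001.mov"],
                      ["A001_C003.mxf"]))  # ['A002_C001.mov'] blocks the conform
```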
That is a pile of contradictory statements. And since you're upset by that idea and unwilling to re-read what you wrote, here's some spoon-feeding:
"With Covid remote access became the norm and the online/proxy workflow more or less died"
No; remote access DEMANDS a proxy workflow, since you're not going to edit full-resolution files over the Internet. So it did not "die;" just the opposite. Witness the entire "camera to cloud" marketing mania that swept NAB a few years ago. That's based entirely on the rapid upload of proxy files to begin editing ASAP.
From NAB last year:
“We introduced the [Blackmagic Camera] iPhone app a little while ago,” said Bob Caniglia, director of sales for the company in North America. “You can shoot with that phone, work with the cloud service, share proxies. The camera to Blackmagic cloud to Resolve workflow started with the camera app. The Ursa Broadcast G2 [camera] is now in beta for that software too. That's a good direction on where we're going.”
But back to your assertions: "Proxies are used for several reasons, expensive storage, heavy codecs at high bitrates or multicams. They are typically avoided whenever you can because the online part of a proxy based workflow can be a challenge"
That makes absolutely no sense. You just claimed that proxies are used to avoid "heavy codecs at high bitrates" but then claim "the online part of a proxy based workflow can be a challenge." But you neglected to provide a single example of what's so "challenging" about it, especially when you just cited proxies as an advantage.
Thus, since you pushed the issue, we see that in fact it is you who has no idea what you're talking about. But hey, keep insulting other users.
I use Avid too, and I manage two sizable (300+ virtualized editors) on-premise VDI systems, plus one bigger (some days 600+) AWS-based one that holds more Adobe than Avid. The remote experience is a bandwidth and latency thing more than anything else, but the technology is limited: for example, you can't do a good Pro Tools system virtualized with a control surface, and sync can be a real pain to sort out. As for Avid's solutions to the problem, they do it a couple of ways:
- Composer/Nexis all hosted on Cloud (AWS): fine, but pricy and the Nexis experience is meh
- Composer hosted Cloud/Nexis hosted on Prem: actually works well, but you need to have a direct-connect to AWS (the network can be pricey)
- Composer on on-premise VDI/Nexis hosted on prem: works really well, and I have a bias towards this over fully-in-cloud, not only for security reasons but also because the TCO is lower
- Composer Cloud (or whatever they call it today; it used to be Composer Sphere): this is a setup where you instead stream real-time proxy media to Composer from MediaCentral. You can download hi-res media if you need to. It works OK, but it's more suited to news workflows. Security is a thing with this solution.
- Adobe/OpenDrives on AWS: I mention this, because we do this too. This has all sorts of things to talk about, and is pretty good, but, again, you gotta know what you are doing.
For the on-premise ones, VMWare is our Hypervisor of choice, and, yup, we are looking for other options. And we have all the usual IT problems: domain management, updates, roaming desktops, etc.
If you are looking for 3rd-monitor image viewing (like in the old days with hardware), you can swing NDI or 2110. NDI is ok, and for 2110 you need a network and router to handle it.
The "600+, AWS" detail is great to read, as confirmation that this kind of thing does work. We're currently setting up remote AWS systems and finding a lot of moving parts in getting smooth playback while editing in AE/PPro.
If you have time to expand on the "bandwidth and latency thing", I'd love to hear more. Even a "you need to be geographically within X miles of the instance" ballpark figure would be wonderful to know.
My home internet is fiber, 3 Gbps symmetric up/down. Tucked away under the staircase, where my fiber ONT terminates, is my server room. I have half a dozen boxes running various things: four 2012 i7 Mac minis running Linux KVM and hosting various critical services (Pi-hole, home automation, HomeKit Secure Video, etc.).
Then there's a giant former gaming PC with 7 HDD bays running the entire storage backend for a whole load of GoPro/Osmo/Insta360 videos I capture, with rclone to Google Photos for backup. I don't edit any videos; they're just there to capture memories, so that at some point, when AI tools get good enough, I can have them generate clips. The same box runs my Plex server with HW transcoding.
Then there is the actual gaming PC, a mini-ITX build running Steam Remote Play. It has power, a network cable, and a fake HDMI dongle that emulates a monitor to trick the GPU into thinking something is actually plugged in.
Basically everything I do with desktop PCs at home is via some sort of remote interface.
Remote gaming is probably the most demanding of all of these. Low-latency HW-accelerated solutions, e.g. Parsec or Steam Link, are incredible technologies.
I carry an AppleTV + PS5 controllers to friends' houses and play the latest games across the internet.
The honest answer is that it doesn't work very well in practice. This is seemingly worsened over Wi-Fi on AppleTV whose Wi-Fi stack constantly interrupts streaming in order to do a variety of things with their "location services".
Moonlight works great (over ethernet at least) locally though.
> "In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk... I’m not entirely sure we were supposed to see that, but there it is. Oops."
Sounds like this author didn't watch the whole video. They are completely open about the fact that the editing team collaborated through remoting. At 5:20 an editor specifically says they "remoted into the Mac mini."
The second half of the post raises an arguably good question about the need for fancy Macs when cloud-based workflows only require glorified terminals. But that too may be misplaced here; it's entirely possible that the team members each do local editing work and then host their own collaboration sessions.
Bingo. So many decisions made perfect sense once I realized Apple is basically a lifestyle brand that makes electronics, and Microsoft is a massive bureaucratic B2B conglomerate. Totally explained Microsoft’s ineptitude with consumer facing products (remember Windows Phone? Zune?), yet they have a stranglehold on the business world. This is the opposite: Apple is designed for locking individuals into its lifestyle (or ecosystem, if you prefer), and has mostly given up on enterprise facing products.
TBH it's still possible to use a macbook air as basically a fancy unix-like workstation that has great battery life, and not buy into any of the apple ecosystem. No icloud account, no icloud backup, no iphone, no use of itunes or appletv, no apple synchronization of anything. The day that stops being viable is the day I stop buying them.
The extent of my 'cloud' involvement with apple is the operating system software update mechanism and having an account to download Xcode, so that I can install compiler + macports on a new machine.
Heh, it sure would be nice if they made a computer that was explicitly for getting work done (hell, they could call it a "workstation"). I miss the days when big tech still saw a market for this ...
They do - that’s the point of the Mac Pro. The problem is software. Lots of expensive pcie ports won’t help much when you can’t put a GPU in any of them to use cuda and such.
There's also so much inefficient, bloated crap that ships with modern macOS that I would never pick it for a proper workstation these days. I have CPU meters in the menu bar, and there's always some stupid process gobbling up all my spare cycles. The other day it was some automatic iPhone backup process. (Why was that using so much CPU, Apple?) Sometimes it's indexing my hard drive, or looking for faces in photos, or who knows what stupid thing. It's always something, and it's almost always first-party software.
In comparison, the cores on my Linux workstation are whisper quiet, and usually idle at 0%. The computer waits for me to give it work.
There is no reason to care about this. There are two or three different mechanisms that stop background processes from having any effect on the actual work you're doing.
(Namely background QoS, it only runs on the efficiency cores, and more expensive activities stop when the user is active.)
If you're having an actual specific problem report it with Feedback Assistant. If you aren't, I recommend removing all that useless monitoring stuff and getting an outdoor hobby.
As an actual performance engineer I've basically never in my life gotten a useful report from someone looking at those every day. Although other vibes based bugs like "I feel like my battery life is bad lately" often do find something.
You say that - and then I looked up and saw AMPDevicesAgent sitting at 95% CPU for the past - well, who knows how long. What even is that? Oh, some iphone sync thing. Why is it running while my laptop is on battery? I don't want my battery going flat in order to background sync my phone. In fact, I turned background phone sync off in finder a few days ago. Why is it even running?
Are these processes behaving properly or is it in some stupid infinite loop? I can't tell. Is it considered acceptable by apple for background processes to make my efficiency cores sit at 100% utilisation more or less all the time - even when I'm on battery power? How much will that reduce my laptop's battery life?
I can't tell. I have no way to tell. It's all an opaque jungle of processes running processes. Half of them are buggy half the time, and I don't know which half. It gets more complex and stupid every year.
I swear, macOS seemed to run better 10 years ago when I had a computer that was many times slower. Strangely, at the time there were no constant background processes chewing up CPU like this. Tell me, how is any of this stuff making my computing experience better?
I think my preferred computer has a fast, modern CPU and software from a decade or two ago. Off the top of my head, I can't name a single feature added to macOS in the last decade that I actually care about. (Excluding support for modern hardware.)
Huh? I don't find battery life to be that easy to notice. Most of the time I use my laptop, I'm at home - and I'm only on battery power because I sat on the couch and I'm too lazy to reach over and plug my laptop in. The battery goes flat sometimes on zoom calls, or when streaming. But I don't know how many hours I should expect the battery to last while on a zoom call.
The only way I could tell that my battery life has gone down would be by doing actual tests - but those are notoriously difficult - because I can't use my laptop at the same time. (Or, I guess I can - but I'd need to use it the same way across tests). It sounds like days of work to test my battery life with and without transient background tasks. I don't even know how I'd test that - because I don't know how to turn all that stuff off for the control.
I'm also not going to post an issue on apple's bug tracker that I have an intuition that my battery life is worse than it could be. That'd get deleted instantly.
I hear you that complaining online probably won't help. But I can't see how complaining about battery life in Feedback Assistant would help either. The situation is crappy.
"System activity" isn't a valuable user metric because not all CPU %s are equal and CPU % isn't a consumable resource. Fans, battery life, case temperature, some others are.
System activity can certainly cause problems like paging out all the file cache pages you wanted to use when you get back to the machine. It doesn't have to though.
This might be a bit autistic of me, but I don’t trust that random processes sitting on 100% cpu are serving me in any way. I don’t think I want this sort of background process to run on my computer at all.
Are those programs written well, or are they using so many cycles because they’re inefficient and slow? And when did I ever opt in to this? Spotlight has slowly gotten more and more horrible over time. Half the time I use it to invoke system preferences it can’t find it. Or it can’t find the applications folder. If Spotlight is this terrible, why is the hard disk indexer so busy? Is it any better engineered than spotlight? I doubt it. Likewise, I don’t want photoanalysisd looking at my photos. I don’t use that “photos by person” feature. Why does it use hour upon hour of cpu time to make this feature available - just in case I use it later I guess? Get lost.
I really wish Apple stopped adding random crappy features to macOS that I don’t use - but which burn cpu cycles. Instead, fix your shit. Indexing is fine if you make spotlight actually be good again. Photo analysis is useful if I decide it’s useful and turn it on. And maybe if Xcode and SwiftUI weren’t such a buggy, crash ridden, undocumented mess, then maybe, maybe, I’d trust you more to run random background processes.
As it stands, I don’t trust Apple - particularly their application teams - to be good custodians of my cpu.
What are you referring to? Microsoft’s developer tools are top notch. I’d pick visual studio over Xcode any day of the week - Xcode is so crazy buggy that I don’t know how anyone at Apple gets work done on it. And VSCode is probably the most popular ide on the planet.
They own GitHub, they make Visual Studio Code, they made C#/.NET open-source and cross-platform, they added Linux support to Windows (twice), and they created WinGet, just off the top of my head.
I use a MacBook not because it's the best software for development, but because it's the hardest to virtualize.
Our project supports the three major desktop operating systems. I have Windows and Linux VMs that I can switch to when I need to test something on those OSes. No serious corporation is going to risk a Hackintosh.
They use Oracle mail servers for their corporate email. Ironically, it's the direct descendant of the Sun Internet Mail Service software I wrestled with back in the early 2000s.
As part of its new U.S. investments, Apple will work with manufacturing partners to begin production of servers in Houston later this year. A 250,000-square-foot server manufacturing facility, slated to open in 2026, will create thousands of jobs.
Previously manufactured outside the U.S., the servers that will soon be assembled in Houston play a key role in powering Apple Intelligence, and are the foundation of Private Cloud Compute, which combines powerful AI processing with the most advanced security architecture ever deployed at scale for AI cloud computing. The servers bring together years of R&D by Apple engineers, and deliver the industry-leading security and performance of Apple silicon to the data center.
There are a number of reasons why the industry centralises. Particularly in post. One of them is the fact that the shot footage is insured and those policies have very strict clauses about handling the material. Yes this applies to an all-digital production as it would have applied to the film era.
Spreading copies around ad-hoc isn’t a backup plan.
They have redundancy and backups.
> Unless the insurance is anti-piracy insurance?
This is a big part of it, actually. Content that leaks prior to launch can reduce revenues significantly. Both from lost viewership due to people already having seen it, and from negative reviews of the unfinished early edits. Many movies change significantly for the better from early cuts.
The comment I was replying to made it sound like this was insurance purely for recorded footage being destroyed, and not it being leaked. The former is very easy to fix by having everyone's workstations keep a copy of what they're working on[0]. But the more copies you have, the easier it is for the footage to leak. The two risks impose different and conflicting mitigation measures.
[0] Remember that one time Pixar rm -rf'd their server and almost lost Toy Story 2 but for one manager who had a local copy of the project at home?
I suspect that's a big reason. Remember about a decade back when Fox had four of its upcoming television shows leaked onto public and private tracker sites about six months before their actual premieres?
Lucifer, Minority Report, Blindspot, and Carmichael were all leaked, and those shows were on different networks, which means it was likely a third-party company doing effects in post. I don't recall if it was ever sussed out what exactly happened and how they all got leaked, but it definitely made the industry a bit warier.
The linked promotional materials [0] say that they remote into a mac mini running Avid.
> he works on iMac, which remotes into a separate Mac mini that runs Avid
So the conjecture from the article that the mac mini isn't powerful enough is false
> In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk. Instead, it’s being driven by another Mac on the other side of a speedy internet connection
And based on other comments here, this is a pretty common way to do things.
> So the conjecture from the article that the mac mini isn't powerful enough is false
Not what the article says... and that doesn't follow anyways. The remote experience was terrible and the non-remote experience wasn't shown at all. How fast the Mac Mini theoretically is doesn't matter at all once you have such an insane bottleneck.
> And based on other comments here, this is a pretty common way to do things.
And? The industry is making a mistake by knee-capping its editors. It's going into seconds-per-frame territory in the video, it's as close to unusable as you can get. The article seemed to definitively prove its case that someone desperately needs to step back and look at what the requirements for these editors actually are, rather than ramming conflicting demands into each other to appease the anti-piracy insurance mobsters.
I would like to use this comment to mention Parsec. It's unbelievable how much snappier it feels compared to the default Screen Sharing. What is their secret sauce?!
I just wish it didn't require an internet connection for authentication
Try Moonlight, similar tech but open/no cloud auth. Works better over local networks though as opposed to internet (which you need to set up via vpn/portforward etc)
Sunshine/Moonlight are awesome, but FWIW in this specific context it's worth noting that macOS support in Sunshine is still extremely experimental and janky. It's Homebrew-only for now, and when I last tried it out, the main release didn't install at all - only the beta. And even locally over a 10-gig network, while the image quality was great, the latency was abysmal, even before other oddities. I will say this is an enormous improvement over even a year ago, but given the initial gaming-focused use case I suspect that (not at all unreasonably!) they've prioritized client capabilities when it comes to Macs for now.
Last I looked, they didn't support passing through USB devices like Wacom tablets or edit controllers or space mice. I am eager for that stuff to work so that I can start using moonlight/sunshine for more of my work.
This bummed me out, but it looks like it's not? From the Sunshine (server) GitHub page[1]:
Sunshine is a self-hosted game stream host for Moonlight. Offering low latency, cloud gaming server capabilities with support for AMD, Intel, and Nvidia GPUs for hardware encoding. Software encoding is also available.
Is there a free alternative to Screen Sharing that is more performant? I'm just surprised at the latency and cpu usage of Screen Sharing on my lan. (Mac specific)
Is it really just authentication? I thought the whole screen data was passed through an intermediate server, but I can see how a peer-to-peer system would be more efficient. I can't imagine the wonky NAT hacks that need to take place, though.
>How would it not require an internet connection lmao, it's a remote connection tool
I'm kinda surprised you've managed to be on HN for 5 years and never come across the concept of a "LAN" or "VPN" before, but I guess you're one of today's lucky 10000. To the first, sometimes you have machines (or VMs) local to your own network but in another physical location that you'd like to be able to access from your own system. It's a fairly significant use case, and one where no internet connection is involved whatsoever. For example it's generally desirable to locate powerful (and in turn generally loud) servers and associated gear (including environmental control, redundant power etc) in physically isolated locations from where the humans are working for noise reasons if nothing else, though security and efficiency are important as well. While it's possible to pipe raw video over IP, a quality remote desktop solution will generally be more flexible/scalable and doesn't require special (expensive) extra hardware and potentially additional fiber.
And for systems located on other LANs remote from your own, you can use a VPN to link them securely as if they had a direct physical (though higher latency/more jittery) link, again avoiding any exposure to the public net. That then reduces to the above. In both cases it's desirable to have zero unnecessary 3rd party dependencies.
Ah, you got me, I guess - I didn't think of the VPN case. It does seem like an asterisk in the grand scheme, especially since the applicability of this tech on LANs is very limited (there's little lag on a LAN, and it's already "internet" in the sense that it uses IP; you'd need an Ethernet-framing tool, or a Unix-socket tool like X11, for a truly local remote protocol), so this would only be useful in that network-virtualized VPN ecosystem (and also in scenarios where you want to ensure no third party handles the data, by self-hosting the server part of Parsec).
What is clear to me is that Parsec belongs to a newer breed of remote tools, in line with TeamViewer and AnyDesk, that primarily respond to the needs of the post-ISP-firewall era: with ports blocked by default, direct peer-to-peer remote tooling becomes harder to install and administer, so these use a client-server-client architecture. And Parsec builds on that architecture by placing some secret lag-reducing sauce on their server instead of just authenticating and forwarding.
My guess is that they have a proprietary predictive- and interpolation-based algorithm tightly coupled to the OS UI, and this secret sauce lives, closed-source, on their backend, so you would kind of need to host a third server in the middle. Maybe we'll see a competitor in the VPN niche, or an open-source alternative.
If an open source solution arises, I bet that it would require an installation of a server, and it would probably start with X11 or Wayland tight coupling.
It's not snark, in your reply you for whatever reason cut out the context at the end of the sentence. "Lucky 10k" is referring to this xkcd comic [0] which I thought was a pretty good one and I've tried to take to heart. I was genuinely surprised, but that's the point, what one thinks is "common sense" or "everyone knows" is always going to be brand new to someone every single day. It's happened to me lots, and is one of the delights of HN, to learn about a whole new set of use cases you've never considered before. In this case maybe it will lead them to consider how it might be useful in their own offices or homes for that matter. Making a powerful machine run quietly is both challenging and can be fairly expensive. But if you have the physical space available, then you may be able to just use powerful, cheap loud fans by virtue of putting it in an area of a basement or the like away from living space/home office and accessing it remotely. Depending on how you do so the quality can be the same as if you were sitting in front of it.
Cloudless Fluid requires a Teams Enterprise subscription. Or one can manually enter IP addresses. Their default is cloud mediation, so yes, they presume a working internet connection.
Video editing is not as portable as coding, there ain't no git. It doesn't surprise me that they have to do that, I imagine it's simply speedier and comfier to connect to a desktop that already has the work in progress in the latest state instead of ensuring everything is synced on different devices one uses. I also imagine that beefy MBPs with M3 and upwards could handle 4K editing of Severance (or maybe 8K) and they'd edit on local machines, should it be actually more convenient than connecting to a remote desktop. It's a bit shameful to admit, but still something we have to deal with while having such crazy advances in technology.
In principle, a good editing tool could use Git for the edit operations (mere kilobytes!) and use multi-resolution video that is streamed and cached locally on demand.
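A toy illustration of that split - versioning only the textual edit decisions while keeping the heavy media out of the repository entirely. The project name, file names, and the CMX3600-style EDL entry here are all made up:

```shell
# Hypothetical sketch: version only the edit decision list (text), never the media.
mkdir -p demo-project && cd demo-project
git init -q
# Keep heavy media and render caches out of version control entirely;
# those get streamed/cached through a separate store.
printf 'media/\ncache/\n' > .gitignore
# A toy CMX3600-style EDL entry -- a few dozen bytes per cut.
cat > episode01.edl <<'EOF'
TITLE: EPISODE 01
001  TAPE01 V  C  01:00:10:00 01:00:14:12 00:00:00:00 00:00:04:12
EOF
git add .gitignore episode01.edl
git -c user.email=e@x -c user.name=editor commit -q -m "first cut"
git log --oneline
```

The history then grows by kilobytes per cut, while the multi-resolution media lives outside the repo and is fetched at whatever quality the preview needs.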
When I got into projection design I tried using git to keep track of my VFX workspace. After typing `git init` I heard a sharp knock at my apartment door. I opened it to find an exhausted man shaking his head. He said one word, “No.” and then walked away.
Undeterred by this ominous warning, I proceeded to create the git repo anyway and my computer immediately exploded. I have since learned that this was actually the best possible outcome of this reckless action.
All jokes aside, it's too big of a pain in the ass to have that stuff version controlled. Those file formats weren't meant to be version controlled. If there's persistent Ctrl-Z that's good enough and that's the only thing non technical people expect to have. Software should be empathetic and the most empathetic way to have the project available everywhere is either give people a remote machine they can connect to or somehow share the same editor state across all machines without any extra steps.
Meta: If I had to rank software features of an NLE when I was employed as an editor, key-to-photon (or click-to-sample, etc) latency would rank #1, far outpacing all other concerns. It's fundamental to the rhythm feel of the result, and prevents fatigue.
Avid bent over backwards to optimize for that in their software. I can't imagine cloud/remote editing being a good artist tool.
Makes sense why the framerate is so bad during some of the playback scenes. Also makes sense as multiple editors will be sharing the same editing tasks and it’s easier to share a single resource with the scenes loaded that are connected to local storage, and manipulate remotely, versus trying to pull that content to your machine and then push it back.
I've been working this way for a long time. Not video editing, but it's the same principle -- I want to be over here (with my monitor, keyboard and mouse) but the large, complex, performance-sensitive environment I need to use is over there.
Jump is excellent, BTW.
The article seems confused, though. It questions whether Macs are really being used to edit the show, but since the editors are remoting from one Mac to another, it seems unambiguous.
The flavor of both the local machine and remote machine makes a difference. The OS of the machine you're remoting to makes the biggest difference, but since different OS's have their own ways of handling input devices, the local machine's OS is significant too. Every combo has its quirks, but I find Mac to Mac over Jump to be good.
I really want to find a good solution for this. I used to use low-powered devices like Intel NUCs, but I ended up with a bunch of them, so my employer bought me a workstation. It makes so much noise in my office that I barely turn it on, so I'm not getting good use out of it.
I think some kind of KVM-over-IP solution is probably what I need, so I can put the workstation in another room.
I consider KVM over IP to be one form of remoting. It hasn't worked as well as Windows RDP or Jump's "fluid" though (for my purposes). The problem has been the video -- it's too high bandwidth, so you end up making tradeoffs (resolution, compression artifacts, latency, refresh rate) and I couldn't really find something that would work for me. RDP works at a higher level of abstraction than "a video signal" so can be a lot more efficient for the "typical desktop computing" that I need. I don't know how the fluid protocol works, but it must use a higher abstraction as well.
The beauty of kvm over ip, though, is that it can work without any software whatsoever being installed on the client. You plug in a usb and video cable and from the client's perspective, it's the same as if you directly plugged in a mouse, keyboard and monitor. But if you can install software on the client (or enable the existing software, as in the case of Windows RDP), you can typically do better.
He's saying that Apple stuff is hard for IT people to configure, customize, and virtualize, but isn't Apple's whole selling point that you don't need to be an IT person to use their products? It's a different market.
I think that's why a lot of tech companies now give their employees Apple laptops (they are easy for employees to self-support) but use everything but Apple in the data centers.
I think Apple and Microsoft are both prepping us for a future in which our computers are, mostly, mere terminals for their host of cloud services, rather than personal computing devices. This may be a test run/demonstration of whether and how a highly interactive, compute-intensive task like video editing can be performed under such a paradigm.
If anything Apple's gone in exactly the opposite direction, given how much effort they've put into having photo processing, Siri, etc happen locally on specialized hardware. Even stuff like their autocomplete is now using invisible-to-the-end-user LLMs running on local hardware.
I've seen NICE DCV be used for this too. Amazon bought them, so it's free if the server end is on AWS, but they will also sell you licenses for your own hardware too. It's essentially 4k60 video streaming where the video is your desktop and they use all the tricks they've developed for media streaming here as well.
It looks like an absolutely brutal way to edit video[1], even with an incredible internet connection. This is a compromise courtesy of the reality of the Apple hardware ecosystem and not some sort of ideal way of working.
Sometimes I play Civilization through an RDP connection to my desktop box below my desk over a dedicated ethernet connection and that's bad enough. Trying to do full video editing, with critical concerns over every pixel, color and timing....oof!
[1] - as they note, you can see him doing it over the remote connection and it looks herky-jerky and disastrous.
I was a diehard PC person but getting colors to display right and consistently on Apple hardware is much easier… so I admitted defeat.
p.s. I’m the guy that will point out that one of your white lightbulbs has a slight greener tint over your other white lightbulbs (aka it’s not slight to me).
It’s crazy how much of a mess color management is on Windows, even now. I used to try to use a calibrator-produced profile for my gaming PC’s monitor but keeping it applied was hacky and it still didn’t work everywhere.
It is pretty obvious that their use of Apple hardware is forced on them by Apple for this show.
As said in TFA, he could have had a Chromebook on his desk. And for that matter he could have been remoted into a massive server from that Chromebook with a cluster of virtualized GPUs, hosting a dozen editors on a monster backbone. Apple has nothing like that, so instead they have like a NAS connected to a dozen Macs back in the office to host a dozen editors. It's super dodgy, and is a limit, and, as is the point of the article, kind of highlights some serious gaps in Apple's hardware ecosystem.
They're using Avid and Ableton for this show, and then some third party remoting to connect to the Macs. This wasn't really an Apple-first production.
If the discussion were about the best way to play games remotely, your retort would be a great sneer. But it isn't. It's about someone doing full-screen video editing over a remote connection. And FWIW, remoting Civilization is a magnitude easier than full-screen video editing, so my comparison was to something much simpler.
I don't only play Civilization. In fact the reason I have the Windows box under my desk is for CUDA work on a big GPU while my main computer is an M4 Mac. And FWIW, Steam Remote Play is utter dogshit compared to RDP. RDP is actually one of the best remoting technologies.
Still can't make highly dynamic desktops super ideal remote.
For all the failings of Google at running the service as a product to consumers, Stadia actually worked. GeForce Now/others are still around. It's absolutely down to the connection, but the technology's there.
Indeed, I still have a GeForce Now "founders" subscription as my son uses it, and I did originally use it to scratch the Civilization itch. At least until 2k got greedy and removed it.
But...wait...just looking and it appears that Civilization has joined GFN again. Apparently they saw GFN as a selling point for 7 so they offered it again. Huh.
> It turns out that RDP is one of the best remoting technologies.
I was very surprised by this too. I think it was around Windows 8.1 when RDP from one machine to another became basically a no-compromise experience for most gaming, except for FPS games - the latency was always a little too high.
Nowadays I can use Parsec over WiFi at 4K and almost can't tell the difference. Almost. And only with a controller.
There's a lot of things that are possible and even adequate, but not a good idea unless you're sure that the org will not cheap out on Internet connection or other necessary infra.
A colleague used to work for Apple, outside the US, and described his development environment as SSH’ing into a physical machine in Cupertino and working exclusively in a terminal with 100ms latency, because they weren’t allowed on site machines.
> If you want to run a Mac in the cloud, it has to be a full machine in most cases.
If cloud companies have the opportunity they will divide the resources of that Mac into 30 vms and then meter access to the point where it would have been cheaper to go out and acquire the hardware yourself.
Unpopular opinion, but Apple should stick to its guns and maybe create a physical Mac rack server with legal and technical restrictions on maximum tenancy.
> If cloud companies have the opportunity they will divide the resources of that Mac into 30 vms and then meter access to the point where it would have been cheaper to go out and acquire the hardware yourself.
That’s…how cloud computing is supposed to work? You pay them a premium to set this up. The fact that this isn’t possible is why everyone is annoyed.
A super easy way to work on big video files without the hassle of remote desktop, the back-and-forth with the team, versioning, etc. is LucidLink (https://www.lucidlink.com/), a content-creation collaboration tool lots of studios use. The app makes accessing cloud files as smooth and fast on your laptop as if they were local.
I will post a text to a friend of mine from a day ago: "I use my iphone to access pwas on my server so I can use it as a computer. I use my computer for x forwarding so I can use my server's programs."
I'm not the norm but isn't it telling when I don't want to use your hardware, I have to? I want to enjoy these products, but their immutability compared to prior versions is a thorn in my side.
I don't get that at all. I've been doing something like this since before the iPhone ever existed because I like having a logfile of all of my inputs and outputs, and I like being extremely mobile. Immutability means that if my phone gets smashed or stolen, I can just go buy a new one and not lose my shit. What's the thorn?
These days I think do something very similar to your friend. I pack an external keyboard and mouse I like, and a chromecast (just in case) and that's it. Not packing a big screen and a battery is just a huge bonus.
The thorn is that when I wanted to run a Python program, download an MP4 and convert it, or edit an HTML file (all things I got hamstrung doing recently), I couldn't, because the iPhone restricts access to its shell, as well as several common features in its forced web driver that you can't opt out of. My phone is immutable, yes, but what's the point if its immutable state doesn't do what you want it to?
I have absolutely no experience with video editing, so I can't say if it's good or not, but the point the article is making is very clear:
Put another way, if Stiller's team was building this for Amazon or Netflix, would that be a Mac Mini on Richman’s desk, or an HP or Lenovo box? Why even use a Mac in this editing process at all, when other companies offer access to better GPUs anyway?
[...]
Sure, there's an Apple logo in the top-left corner (two, actually), but it feels superfluous, knowing that the software isn’t directly on the machine and it [could] just as easily be running on a Windows or Linux box a thousand miles away. There are way more efficient ways to do this, and Apple doesn't offer them. Instead it relies on cloud providers like MacStadium, or localized IT teams, to work around their convoluted rules around VMs.
So the client is irrelevant (it's just a terminal), and a non-Apple server would be a better option. (Again, I have no idea if any of this is actually true.)
The point of the article, and the full quote, is "These editors aren't working on Macs, per se. They're working around them".
The point being ignored is that the editing software is also running on Macs [1]. The local and remote machines are both Macs. To the general public, and the purpose of their story, the fact that you could replace the local one with any dumb client is irrelevant [2].
[1] I don't know about the software they're using in particular (Avid?), but there is a chance the Macs are actually faster than what you can currently get from PCs regardless of GPU - nothing atm matches the bandwidth you can get from Apple Silicon, ProRes hardware encoders, power consumption, noise
[2] it's not even actually irrelevant as the overall experience of owning a Mac, outside of the remote desktop client, will be substantially different.
One more time, I'm not arguing for the article's point -- as I wrote, I have absolutely no experience with video editing, so I have no idea what are the demands regarding hardware/software.
I was simply clarifying "what's the point of the article". Your quote fragment ("These editors aren't working on Macs") makes it look like the article is saying something it's obviously not (as it's clear from the very next sentence).
To me, not sitting in front of a Mac, but using it via remote desktop, doesn’t change the fact the show is being edited on Macs. Which makes me not see the point in the article.
It’s like seeing a car ad where someone is being driven around by a chauffeur in a BMW, then arguing that “they are not actually driving it, it could be any car”.
> [...] doesn’t change the fact the show is being edited on Macs. Which makes me not see the point in the article.
I don't know what else to say, other than to show again this full quote: "These editors aren't working on Macs, per se. They're working around them". The point is not that they're not using Macs, he knows they are.
What you're doing is like seeing the scene[1] from the Watchmen movie where a character in prison says "I'm not locked in here with you. You're locked in here with me", and then thinking "Wow, that character was totally wrong! Of course he was locked in there with them, he was in a prison!".
I'm surprised that at this point Apple still doesn't have some sort of solution for cloud/remote editing integrated into Final Cut. What I mean is a native desktop GUI, but with the video files streaming from a remote location for the previews, thumbnails, etc. Heck, the GUI could even be a web app.
If the GUI is running locally I don't think latency would be that bad given that you're not on the other side of the world and you have a decent connection.
I do this all the time and get laughed at. I try to explain the exact same reasons but no one pays attention. I guess I just needed the Big Tech gatekeepers to tell the sheep that it works. Among sheep, it's not about the message but the messenger.
I kind of think the 80TB of video files might have contributed to that? Maybe it was easier to use the jump desktop to do it on another computer than it would have been to copy and pass around the video files?
Couldn't the traffic be LAN? Everyone keeps mentioning 'over the internet' - the device they're doing the editing on could be in a different room in the same building over gigabit++ speeds.
The reason is to never allow anyone (even the editors) to have actual access to the show's files/images. Remote software can prohibit copy/paste, file transfers, and screenshots. I worked in a post facility with 100s of people all remoting in to a server rack down the hall.
Nope - cloud and local processing are always going to be two separate things; neither will replace the other. Cloud has been around for a while, and if you look at games, nobody wants to play through a service like Stadia.
Ultimately / objectively I agree, but subjectively I'm not so sure -- I recently gave NVIDIA GeForce NOW a whirl (cloud gaming, $19.99/month (cheaper prepaid) for an 'Ultimate' account which instantly connects you to an RTX 4080 VM + HDR + max 240fps) and it just works. Super smooth, realtime gameplay at max graphics.
I wanted to test it out given that my son was looking to upgrade his PC and not only are component costs through the roof, there's barely any inventory to be had if you were trying to buy exactly what you want! (thanks, resellers...)
It's not a perfect setup, obviously - really only ideal for AAA games with cloud saves, no mods, etc. (Cyberpunk 2077, that kind of thing), and I won't argue that it's ultimately better than local gaming (it's not), but I will say that in my experience the hardware-rendered framerates are through the roof, it streams seamlessly at high resolution, and I could envision a scenario where video editing on an appropriate VM would be virtually indistinguishable from local.
Looks good, don't see the drawback for this usecase
"These editors aren't working on Macs, per se. They're working around them. Sure, there's an Apple logo in the top-left corner (two, actually), but it feels superfluous, knowing that the software isn’t directly on the machine and it just as easily be running on a Windows or Linux box a thousand miles away"
But the source AND target of the remote connections are both macs, pretty straightforward
the point they make is that if you're using a remote desktop program to remotely edit videos you don't need a mac on the client side.
What would have been far better PR is if Final Cut offered an enterprise solution with a "server" part that holds the videos and does the heavy lifting, and a "client" part that works with low-res proxies, doesn't let you export anything to disk, and all that.
I'm not advocating for vendor locking, it's not what I said. I said if they wanted to advertise macs for enterprise video editing they should have implemented this in FCP natively.
This however wouldn't be vendor locking because remoting would still be an option.
Instead they've shown that remote desktop is the most reliable way to do this kind of job because there isn't anything better.
Well, it is similar to RDP with H.264 (and yes, you can do H.264 in RDP, and yes, the text has no artifacts), but not RDP where it regards authentication.
I mean, of course. The source video files for an entire season of 4K TV are friggin' huge, and you want different editors to be able to work in different locations.
The article argues:
> To me, though, it highlights a huge issue with Apple’s current professional offerings. They are built to work on a single machine. At least for high-end use cases, the remote workflow threatens to cut them out of the equation entirely...
This is hardly a "huge issue". Plenty of people work on a single machine. Once your project gets too big, you move more and more to remote and cloud. It's a spectrum, and you want a machine flexible enough to handle both.
Kind of funny to me that they have to go so "thin client" with this.
You'd think there'd be some kind of "mipmap gateway" component to network-aware video editors, that incrementally re-renders scrub-quality and preview-quality renders of the timeline as the client tells it about project changes, and then streams those rendered changes back down the pipe to the client, proactively, into a local cache — without the client ever needing to (or even being allowed to!) hold the raw assets.
Then the local "fat client" editing UI could be snappy at pretty much all times — except for just after modifying the timeline, when it'd have to flush (some variable amount of) the preview cache. (And even then, the controls would still respond; just the preview and timeline-thumbnails would jitter, until the [active part of the] re-cache finished.)
Would this enable piracy? No! Who's going to want to release a 480p rip of a TV episode at this point? (And 480p is all you need for a functional live preview when lining up ADR or B-roll or whatever else. Anything needing closer examination — VFX, say — could be rendered and sent by the gateway "on demand", as stills [on play-head stop] or as short clips [on first play after range-selection].)
(It would enable leaks... but so does RDP, if you combine it with local video-capture software. So that's nothing new.)
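The incremental re-render idea above can be sketched in a few lines - a purely hypothetical cache model with invented names, not any real editor's API. The gateway keeps low-res preview renders per timeline segment and flushes only the ranges a client edit invalidates:

```python
# Toy sketch of the "mipmap gateway" idea: the server tracks which preview
# segments are still valid, and re-renders only those an edit touched.

class PreviewCache:
    def __init__(self, duration_s, segment_s=2):
        self.segment_s = segment_s
        n = -(-duration_s // segment_s)  # ceiling division: number of segments
        self.ready = [True] * n          # which preview segments are rendered

    def invalidate(self, start_s, end_s):
        """An edit touched [start_s, end_s): flush the overlapping segments."""
        first = start_s // self.segment_s
        last = (end_s - 1) // self.segment_s
        for i in range(first, min(last + 1, len(self.ready))):
            self.ready[i] = False

    def pending(self):
        """Segments the gateway should re-render and push down to the client."""
        return [i for i, ok in enumerate(self.ready) if not ok]

cache = PreviewCache(duration_s=60)  # a one-minute timeline, 2 s segments
cache.invalidate(5, 9)               # editor trims a clip spanning 5 s - 9 s
print(cache.pending())               # -> [2, 3, 4]
```

The rest of the timeline stays snappy because the client's local preview cache is untouched everywhere else; only the flushed segments jitter until the re-renders stream back in.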
> requirements in its EULA that seem designed to protect its hardware business above all else
To get this, you have to understand Apple's business model. They sell style, quality, exclusivity, and ease of use. They can't ensure those things if they separate the hardware from the software. I'm sure they would love to make money from software licenses without the hardware, but it would end up creating new problems that would dilute the value of their product.
The proof is in the pudding: they're the most valuable company in the world because of their limitations, not despite them.
Doing video editing full-time over remote desktop must be painful. Perhaps a 20% productivity hit from that decision alone?
Kinda surprised professional video editing software hasn't been designed with this exact use case in mind - i.e., a worker running local software doing the editing, backed by a remote server with tens of terabytes of storage and high-powered GPUs.
The software would do standard-definition and basic rendering fast locally, while simultaneously requesting that the 8K data be rendered remotely and downloaded, so a full-fidelity preview can be seen after a few seconds.
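That two-tier flow might look roughly like this - a hypothetical sketch where the remote render is simulated with a short sleep; none of these function names come from a real NLE:

```python
# Sketch of two-tier preview: show a fast local standard-def render
# immediately, then swap in a full-fidelity frame once the (simulated)
# remote render of the same range comes back.
from concurrent.futures import ThreadPoolExecutor
import time

def render_local_proxy(frame_range):
    # Instant, low fidelity -- cheap enough for any laptop.
    return f"480p render of frames {frame_range}"

def render_remote_full(frame_range):
    time.sleep(0.1)  # stands in for network round-trip + GPU render time
    return f"8K render of frames {frame_range}"

pool = ThreadPoolExecutor(max_workers=1)
frames = (120, 240)
preview = render_local_proxy(frames)                # shown immediately
full_job = pool.submit(render_remote_full, frames)  # requested in parallel
print(preview)
print(full_job.result())                            # swapped in when ready
```

The editor never blocks on the network: scrubbing always hits the local proxy, and the full-resolution result simply replaces it a beat later.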
Why? Storage is cheap, and desktop computers have plenty of power to edit. Why on earth would you want to try to stream all of this shit over the Internet?
Oh yeah, this is completely true. It would be a shoebox that barely ran Windows 98 and it wouldn't make much difference, and Apple's tools have completely failed to keep pace with this reality.
>...please use the original title, unless it is misleading or linkbait; don't editorialize.
>Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity.
He said, "We don't collaborate at Apple because of the (perceived) risk of leaks. None of our tools are built for collaboration". Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.
So it doesn't surprise me that their video editing tools are designed for a single user at a time.
Edit: This happened about six years ago, they have since added some collaboration tools, however it's more about the attitude at Apple in general and why their own tools lag on collaboration.
Edit 2: After the replies I thought I was going crazy. I actually checked my message history and found the discussion. I knew this happened pre-COVID, but it was actually in 2013, 12 years ago. I didn't think it was that long ago.
reply