
Has the author considered supporting Live Photo display? #2297

Open
Arlexovincy opened this issue Sep 25, 2024 · 22 comments

Comments

@Arlexovincy

Check List

Thanks for taking the time to open an issue. Before you submit your issue, please confirm these boxes are checked.

Issue Description

Has the author considered supporting Live Photo display? Handling Live Photo caching and display on my own is very complex and a bit beyond me. I hope the author can consider it. Thank you.

What

[Tell us about the issue]

Reproduce

[The steps to reproduce this issue. What is the url you were trying to load, where did you put your code, etc.]

Other Comment

[Add anything else here]

@onevcat
Owner

onevcat commented Sep 30, 2024

@Arlexovincy

If I remember correctly, a Live Photo downloaded from the network is split into a still image (HEIC) and a video (MOV), so downloading and caching may need to be handled separately. Could you share some real-world scenarios of how you use Live Photos (for example, where your Live Photos come from and how the server delivers these resources)? That could help in designing a more reasonable API.

Thanks.

@Arlexovincy
Author

@onevcat

Thank you very much for taking the time to reply.
We are building an emotional-healing app for recording momentary thoughts and feelings, similar to a social app. Our use case is as follows:
1. Users can publish posts on our app, and a post can contain multiple images. We currently only support GIF, PNG, JPG, and similar formats, but we would like to support Live Photos as well. The images are currently hosted in a UICollectionView and displayed in a UIImageView inside each cell.
2. To support Live Photos, we would pick them from the photo library, split each Live Photo into one image and one MOV video, store both on our OSS storage server, and obtain the corresponding URLs.
3. Once Live Photos are supported, we hope, if possible, that Kingfisher can automatically cache the corresponding files for us and provide a view that can tell from the URL's parameters whether the resource is a GIF, a PNG, or a Live Photo, and then display it in the appropriate container (UIImageView or PHLivePhotoView). For Live Photos, we would also like developers to be able to control playback manually.

Thanks again for the reply. Looking forward to the next Kingfisher release.

@onevcat
Owner

onevcat commented Sep 30, 2024

Deciding which internal view to use based on the URL, or even the response data, goes somewhat beyond Kingfisher's original design goals.

One option is to add a set of kf-like extension methods to PHLivePhotoView for quickly downloading/caching resources and setting the image. Correspondingly, your server should carry the information that "this image is a Live Photo": in the collection view cell, use a UIImageView to load a regular image (plus a Live Photo badge), and on the detail page shown after tapping a Live Photo cell, use PHLivePhotoView with the newly added image-setting methods. Weighing the usage logic, the data consumption, and the difficulty of changing Kingfisher itself, that is probably the more reasonable choice.

What do you think?

@onevcat
Owner

onevcat commented Sep 30, 2024

Of course, if you really do need to display Live Photos mixed in among the cells, you could also have the server include metadata in its response indicating that a given image is a Live Photo, so that you can choose between UIImageView and PHLivePhotoView directly when creating the cell.
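The metadata-driven approach above could be sketched roughly like this (a minimal illustration; the `PostImage` model, its `isLivePhoto` flag, and the cell reuse identifiers are hypothetical and not part of Kingfisher):

```swift
import UIKit

// Hypothetical model: the server response marks each image as a Live Photo or not.
struct PostImage {
    let imageURL: URL
    let videoURL: URL?   // paired MOV, present only for Live Photos
    let isLivePhoto: Bool
}

final class FeedDataSource: NSObject, UICollectionViewDataSource {
    var images: [PostImage] = []

    func collectionView(_ collectionView: UICollectionView,
                        numberOfItemsInSection section: Int) -> Int {
        images.count
    }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let item = images[indexPath.item]
        // Choose the cell type up front from the server metadata, instead of
        // sniffing the URL or response data inside the image library.
        let reuseID = item.isLivePhoto ? "LivePhotoCell" : "ImageCell"
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: reuseID,
                                                      for: indexPath)
        // An ImageCell would configure a UIImageView (e.g. via kf.setImage);
        // a LivePhotoCell would configure a PHLivePhotoView.
        return cell
    }
}
```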

@Arlexovincy
Author

@onevcat

I think this approach is very good, and it fits well with the consistency of Kingfisher's current design.

Really looking forward to this release. Thank you!

@Arlexovincy
Author

Of course, if you really do need to display Live Photos mixed in among the cells, you could also have the server include metadata in its response indicating that a given image is a Live Photo, so that you can choose between UIImageView and PHLivePhotoView directly when creating the cell.

That's not a big problem; as long as the server side and I agree on a protocol, we can tell them apart.

@Arlexovincy
Author

@onevcat Hello!
I see that the master branch already supports Live Photo, but the latest tag, 8.0.3, doesn't include it yet. Does it still need some testing, or will the new tag simply be published a bit later?

@onevcat
Owner

onevcat commented Oct 12, 2024

There is still some work to do: a final round of refactoring, additional documentation, and so on. If you're in a hurry, you can try master first; feedback and suggestions are welcome.

@zkhCreator

So it looks like it's ready now?

@HIIgor

HIIgor commented Oct 30, 2024

@onevcat Hi.
We use Kingfisher in our project to display Live Photos, and I'd like to report an issue: with URLs we uploaded to our own storage server, the Live Photo flashes once and then goes blank. The code is as follows:

        guard let videoURL = URL(string: "https://wx-love-img.afunapp.com/ff43cec0e1b1bd63f700d5065290338b"),
              let imgURL = URL(string: "https://wx-love-img.afunapp.com/7a478877eb27b7531a97c2c1dc1d21fe")
        else { return }
        
        let source = LivePhotoSource([
            LivePhotoResource(downloadURL: imgURL, fileType: .heic),
            LivePhotoResource(downloadURL: videoURL, fileType: .mov)
        ])
        
        livePhotoView.kf.setImage(with: source, completionHandler: { result in
            switch result {
            case .success(let r):
                print("Live Photo done. \(r.loadingInfo.cacheType)")
                print("Info: \(String(describing: r.info))")
                self.livePhotoView.startPlayback(with: .full)
            case .failure(let error):
                print("Live Photo error: \(error)")
            }
        })

The development environment is iOS 18.0.1 with Xcode 16. The cause we found is that the downloaded image and video may be missing metadata; after we additionally process them locally, playback works fine. The processing logic follows the `addIdentifier` implementation described in https://juejin.cn/post/7222229682027610149?searchId=20241030174923E793C0448C99F5168EF8#heading-5.
I'm wondering whether this extra processing could be embedded inside Kingfisher, or whether an interface could be exposed so it can be handled externally. What do you think?

@onevcat
Owner

onevcat commented Oct 31, 2024

@HIIgor It looks like this image and the corresponding video were not exported strictly in Apple's format.

I tried a very crude implementation (you can find it on this branch. Don't use it; it crashes when loading from the cache), but found that if everything goes through this path, performance degrades quite a bit: on an iPhone 16, processing takes about 1.6s for the video and about 0.3s for the image. So we probably can't blindly add the identifier to every image/video. I'll see whether there is a way to optimize and check for this. If it can be done without hurting performance, I'll consider building it in; if there is no particularly good way, the only option is to provide a delegate so users can decide and add it themselves.

In the end, I'd still recommend that this metadata be prepared in advance (either on the server or on the uploading client). On the download and display side, we'd rather receive content that can be shown directly. Please take this into consideration.

@zkhCreator

For this, my approach is to do the conversion at download time. I also considered processing at upload time at first, but for various reasons a batch of Live Photos had already gone live, and supporting both would mean writing the logic twice, which felt hard to maintain, so I just handled it in the Downloader:

class CustomLivePhotoDownloader: ImageDownloader, @unchecked Sendable {
    let identifier: String
    
    init(identifier: String) {
        self.identifier = identifier
        super.init(name: "com.livePhoto.downloader-\(identifier)")
    }
    
    override func downloadImage(
        with url: URL,
        options: KingfisherParsedOptionsInfo,
        completionHandler: (@Sendable (Result<ImageLoadingResult, KingfisherError>) -> Void)? = nil) -> DownloadTask {
            return super.downloadImage(with: url, options: options) { result in
                switch result {
                case .success(let imageResult):
                    self.parseImageResult(url: url, result: imageResult, completionHandler: completionHandler)
                case .failure:
                    completionHandler?(result)
                }
            }
    }
    
    private func parseImageResult(url: URL, result: ImageLoadingResult, completionHandler: (@Sendable (Result<ImageLoadingResult, KingfisherError>) -> Void)?) {
        guard let completionHandler = completionHandler else {
            return
        }
        
        if url.pathExtension == "mov" {
            Task {
                print("Start processing video data \(result.url?.absoluteString ?? "")")
                do {
                    let newResult = try await self.parseMovieResult(result: result)
                    print("Finished processing video data \(result.url?.absoluteString ?? "")")
                    completionHandler(newResult)
                } catch {
                    completionHandler(.failure(.requestError(reason: .emptyRequest)))
                }
            }
        } else {
            Task {
                do {
                    print("Start processing image data \(result.url?.absoluteString ?? "")")
                    let newResult = try await self.parseImageResult(result: result)
                    print("Finished processing image data \(result.url?.absoluteString ?? "")")
                    completionHandler(newResult)
                } catch {
                    completionHandler(.failure(.requestError(reason: .emptyRequest)))
                }
            }
        }
    }
    
    private func parseImageResult(result: ImageLoadingResult) async throws -> Result<ImageLoadingResult, KingfisherError> {
        let resultData = try await LivePhotosUtils.sharedInstance.addIdentifier(identifier, toPhotoData: result.originalData)
        return .success(.init(image: result.image, url: result.url, originalData: resultData))
    }
    
    private func parseMovieResult(result: ImageLoadingResult) async throws -> Result<ImageLoadingResult, KingfisherError> {
        let resultData = try await LivePhotosUtils.sharedInstance.addIdentifier(identifier, toMovieData: result.originalData)
        return .success(.init(image: result.image, url: result.url, originalData: resultData))
    }
}

I originally wanted to handle this in a Processor, but found that in the Live Photo path the processor is fixed, so I had to move one step earlier.

@onevcat
Owner

onevcat commented Nov 6, 2024

@zkhCreator Thanks for sharing; I understand the situation now. May I ask how this performs, for example how long addIdentifier takes for a video? And would you mind sharing the implementation of addIdentifier(_:toMovieData:)?

Also, you could consider implementing this delegate: https://github.com/onevcat/Kingfisher/blob/master/Sources/Networking/ImageDownloaderDelegate.swift#L75-L96

It would probably be cleaner than subclassing a downloader.
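As a rough illustration of the delegate route, here is a sketch under assumptions: the data-mutating `imageDownloader(_:didDownload:with:)` hook shown should be checked against the linked `ImageDownloaderDelegate.swift` for the exact signature in your Kingfisher version, and `patchIdentifier` is a hypothetical synchronous wrapper around the `addIdentifier` logic from this thread, not a Kingfisher API:

```swift
import Foundation
import Kingfisher

// Hypothetical synchronous wrapper around the addIdentifier helpers from
// this thread; the real ones are async and would need bridging here.
func patchIdentifier(_ id: String, into data: Data, isMovie: Bool) -> Data? {
    // ... CGImageDestination / AVAssetWriter work, as in LivePhotosUtils ...
    return data
}

final class LivePhotoPatchingDelegate: ImageDownloaderDelegate {
    let identifier: String
    init(identifier: String) { self.identifier = identifier }

    // Data-mutating hook: returning modified Data makes Kingfisher process
    // and cache the patched bytes; returning nil fails the task. Verify the
    // exact labels against the linked ImageDownloaderDelegate.swift.
    func imageDownloader(_ downloader: ImageDownloader,
                         didDownload data: Data,
                         with dataTask: SessionDataTask) -> Data? {
        // QuickTime movies carry an "ftyp" box at byte offset 4; use that
        // (or the request URL) to tell the paired video from the still image.
        let isMovie = data.count > 8 && data[4..<8].elementsEqual("ftyp".utf8)
        return patchIdentifier(identifier, into: data, isMovie: isMovie)
    }
}
```

The delegate could then be assigned to a dedicated `ImageDownloader` instance, avoiding the `downloadImage` override.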

@zkhCreator

zkhCreator commented Nov 6, 2024

extension PHAssetResource: @unchecked @retroactive Sendable {}
extension AVAssetTrack: @unchecked @retroactive Sendable {}
extension PHLivePhoto: @unchecked @retroactive Sendable {}
extension AVAssetReader: @unchecked @retroactive Sendable {}
extension AVAssetWriterInput: @unchecked @retroactive Sendable {}
extension AVAssetReaderTrackOutput: @unchecked @retroactive Sendable {}
extension AVAssetWriter: @unchecked @retroactive Sendable {}

enum LivePhotosDisassembleError: Error {
    case requestDataFailed
    case noFilenameExtension
    case noImageData
}

enum LivePhotosAssembleError: Error {
    case addPhotoIdentifierFailed
    case createDestinationImageFailed
    case writingVideoFailed
    case writingAudioFailed
    case requestFailed
    case loadTracksFailed
    case loadMovieResultFailed
    case noCachesDirectory
}

public actor LivePhotosUtils {
    public static let sharedInstance = LivePhotosUtils()
    
    public func isLivePhoto(item: PHPickerResult) async -> Bool {
        return await item.itemProvider.isLive()
    }
}

public struct LivePhotoParsedModel: Sendable {
    public let photoUrl: URL
    public let movieUrl: URL
    public let image: UIImage
    
    public init(photoUrl: URL, movieUrl: URL, image: UIImage) {
        self.photoUrl = photoUrl
        self.movieUrl = movieUrl
        self.image = image
    }
}

// MARK: - disassemble
public extension LivePhotosUtils {
    // Disassemble a Live Photo into its still image and paired video
    func disassemble(livePhoto: PHLivePhoto) async throws -> LivePhotoParsedModel {
        let assetResources = PHAssetResource.assetResources(for: livePhoto)
        let list = try await withThrowingTaskGroup(of: (PHAssetResource, Data).self) { taskGroup in
            for assetResource in assetResources {
                taskGroup.addTask {
                    return try await withCheckedThrowingContinuation { continuation in
                        let dataBuffer = NSMutableData()
                        let options = PHAssetResourceRequestOptions()
                        options.isNetworkAccessAllowed = true
                        PHAssetResourceManager.default().requestData(for: assetResource, options: options) { data in
                            dataBuffer.append(data)
                        } completionHandler: { error in
                            guard error == nil else {
                                continuation.resume(throwing: LivePhotosDisassembleError.requestDataFailed)
                                return
                            }
                            continuation.resume(returning: (assetResource, dataBuffer as Data))
                        }
                    }
                }
            }
            var results: [(PHAssetResource, Data)] = []
            for try await result in taskGroup {
                results.append(result)
            }
            return results
        }
        guard var photo = (list.first { $0.0.type == .photo }),
              let video = (list.first { $0.0.type == .pairedVideo }) else {
            throw LivePhotosDisassembleError.requestDataFailed
        }
        
        let (imageData, image) = try compressImage(imageData: photo.1)
        photo.1 = imageData
        
        let cachesDirectory = cachesDirectory()
        let photoURL = try save(photo.0, data: photo.1, to: cachesDirectory, fileExtension: "jpeg")
        let videoURL = try save(video.0, data: video.1, to: cachesDirectory)
        return LivePhotoParsedModel.init(photoUrl: photoURL, movieUrl: videoURL, image: image)
    }
    
    private func save(_ assetResource: PHAssetResource, data: Data, to url: URL, fileExtension: String? = nil) throws -> URL {
        guard let ext = fileExtension ?? UTType(assetResource.uniformTypeIdentifier)?.preferredFilenameExtension else {
            throw LivePhotosDisassembleError.noFilenameExtension
        }
        let destinationURL = url.appendingPathComponent(NSUUID().uuidString).appendingPathExtension(ext as String)
        
        // Create the directory if it doesn't exist
        try FileManager.default.createDirectory(at: destinationURL.deletingLastPathComponent(), withIntermediateDirectories: true, attributes: nil)
        
        try data.write(to: destinationURL, options: [Data.WritingOptions.atomic])
        return destinationURL
    }
    
    private func compressImage(imageData: Data) throws -> (Data, UIImage) {
        guard let image = UIImage(data: imageData), let imageData = image.jpegData(compressionQuality: 0.9) else {
            throw LivePhotosDisassembleError.noImageData
        }
        return (imageData, image)
    }
}

// MARK: - Assemble
public extension LivePhotosUtils {
    // Assemble an image and a video into a Live Photo
    func assemble(photoURL: URL, videoURL: URL, progress: ((Float) -> Void)? = nil) async throws -> (PHLivePhoto, (URL, URL)) {
        let cacheDirectory = try assembleCachesDirectory()
        let identifier = UUID().uuidString
        let pairedPhotoURL = try addIdentifier(
            identifier,
            fromPhotoURL: photoURL,
            to: cacheDirectory.appendingPathComponent(identifier).appendingPathExtension("jpg"))
        let pairedVideoURL = try await addIdentifier(
            identifier,
            fromVideoURL: videoURL,
            to: cacheDirectory.appendingPathComponent(identifier).appendingPathExtension("mov"),
            progress: progress)
        
        let livePhoto = try await combinePhotoLive(imageURL: pairedPhotoURL, movieURL: pairedVideoURL)
        return (livePhoto, (pairedPhotoURL, pairedVideoURL))
    }
    
    func combinePhotoLive(imageURL: URL, movieURL: URL) async throws -> PHLivePhoto {
        let livePhoto = try await withCheckedThrowingContinuation({ continuation in
            PHLivePhoto.request(
                withResourceFileURLs: [imageURL, movieURL],
                placeholderImage: nil,
                targetSize: .zero,
                contentMode: .aspectFill) { livePhoto, info in
                    if let isDegraded = info[PHLivePhotoInfoIsDegradedKey] as? Bool, isDegraded {
                        return
                    }
                    if let livePhoto {
                        continuation.resume(returning: livePhoto)
                    } else {
                        continuation.resume(throwing: LivePhotosAssembleError.requestFailed)
                    }
                }
        })
        return livePhoto
    }
        
}

// MARK: --- Assemble Photo

extension LivePhotosUtils {
    private func addIdentifier(_ identifier: String, fromPhotoURL photoURL: URL, to destinationURL: URL) throws -> URL {
        let imageSource = CGImageSourceCreateWithURL(photoURL as CFURL, nil)
        let destinationType = UTType.jpeg.identifier as CFString
        let destination: CGImageDestination? = CGImageDestinationCreateWithURL(destinationURL as CFURL, destinationType, 1, nil)
        
        try addIdentifierToImage(identifier: identifier, imageSource: imageSource, destination: destination)
        
        return destinationURL
    }
    
    public func addIdentifier(_ identifier: String, toPhotoData photoData: Data) throws -> Data {
        let imageSource = CGImageSourceCreateWithData(photoData as CFData, nil)
        let destinationType = UTType.jpeg.identifier as CFString
        let mutableData = NSMutableData()
        let destination: CGImageDestination? = CGImageDestinationCreateWithData(mutableData, destinationType, 1, nil)
        
        try addIdentifierToImage(identifier: identifier, imageSource: imageSource, destination: destination)
        
        return mutableData as Data
    }
    
    private func addIdentifierToImage(identifier: String, imageSource: CGImageSource?, destination: CGImageDestination?) throws {
        guard let imageSource = imageSource,
              let imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, nil),
              var imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [AnyHashable : Any],
              let destination = destination else {
            throw LivePhotosAssembleError.addPhotoIdentifierFailed
        }
        
        let identifierInfo = ["17" : identifier]
        imageProperties[kCGImagePropertyMakerAppleDictionary] = identifierInfo
        
        CGImageDestinationAddImage(destination, imageRef, imageProperties as CFDictionary)
        
        if !CGImageDestinationFinalize(destination) {
            throw LivePhotosAssembleError.createDestinationImageFailed
        }
    }
}

// MARK: --- Assemble Movie

extension LivePhotosUtils {
    public func addIdentifier(
        _ identifier: String,
        toMovieData toData: Data,
        progress: ((Float) -> Void)? = nil
    ) async throws -> Data {
        // Save the data to a temporary file
        guard let tempUrl = TemplateFileStorage.save(data: toData, fileExtension: "mov") else {
            throw LivePhotosAssembleError.writingVideoFailed
        }
        
        let toDataDestination = TemplateFileStorage.filePath(directoryPath: [], fileName: UUID().uuidString, pathExtension: "mov")
        let result = try await addIdentifier(identifier, fromVideoURL: tempUrl, to: toDataDestination, progress: progress)
        TemplateFileStorage.deleteFile(at: tempUrl)
        
        guard let data = TemplateFileStorage.readFile(at: result) else {
            TemplateFileStorage.deleteFile(at: result)
            throw LivePhotosAssembleError.loadMovieResultFailed
        }
        
        TemplateFileStorage.deleteFile(at: result)
        
        return data
    }
    
    private func addIdentifier(
        _ identifier: String,
        fromVideoURL videoURL: URL,
        to destinationURL: URL,
        progress: ((Float) -> Void)? = nil
    ) async throws -> URL {
        
        let asset = AVURLAsset(url: videoURL)
        // --- Reader ---
        
        // Create the video reader
        let videoReader = try AVAssetReader(asset: asset)
        
        // Create the video reader output
        guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else { throw LivePhotosAssembleError.loadTracksFailed }
        let videoReaderOutputSettings : [String : Any] = [kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_32BGRA]
        let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderOutputSettings)
        
        // Add the video reader output to video reader
        videoReader.add(videoReaderOutput)
        
        // Create the audio reader
        let audioReader = try AVAssetReader(asset: asset)
        
        // Create the audio reader output
        guard let audioTrack = try await asset.loadTracks(withMediaType: .audio).first else { throw LivePhotosAssembleError.loadTracksFailed }
        let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
        
        // Add the audio reader output to audioReader
        audioReader.add(audioReaderOutput)
        
        // --- Writer ---
        
        // Create the asset writer
        let assetWriter = try AVAssetWriter(outputURL: destinationURL, fileType: .mov)
        
        // Create the video writer input
        let videoWriterInputOutputSettings : [String : Any] = [
            AVVideoCodecKey : AVVideoCodecType.h264,
            AVVideoWidthKey : try await videoTrack.load(.naturalSize).width,
            AVVideoHeightKey : try await videoTrack.load(.naturalSize).height]
        let videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoWriterInputOutputSettings)
        videoWriterInput.transform = try await videoTrack.load(.preferredTransform)
        videoWriterInput.expectsMediaDataInRealTime = true
        
        // Add the video writer input to asset writer
        assetWriter.add(videoWriterInput)
        
        // Create the audio writer input
        let audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
        audioWriterInput.expectsMediaDataInRealTime = false
        
        // Add the audio writer input to asset writer
        assetWriter.add(audioWriterInput)
        
        // Create the identifier metadata
        let identifierMetadata = metadataItem(for: identifier)
        // Create still image time metadata track
        let stillImageTimeMetadataAdaptor = stillImageTimeMetadataAdaptor()
        assetWriter.metadata = [identifierMetadata]
        assetWriter.add(stillImageTimeMetadataAdaptor.assetWriterInput)
        
        // Start the asset writer
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)
        
        // Add still image metadata
        let frameCount = try await asset.frameCount()
        let stillImagePercent: Float = 0.5
        await stillImageTimeMetadataAdaptor.append(
            AVTimedMetadataGroup(
                items: [stillImageTimeMetadataItem()],
                timeRange: try asset.makeStillImageTimeRange(percent: stillImagePercent, inFrameCount: frameCount)))
        
        async let writingVideoFinished: Bool = withCheckedThrowingContinuation { continuation in
            Task {
                videoReader.startReading()
                var currentFrameCount = 0
                videoWriterInput.requestMediaDataWhenReady(on: DispatchQueue(label: "videoWriterInputQueue")) {
                    while videoWriterInput.isReadyForMoreMediaData {
                        if let sampleBuffer = videoReaderOutput.copyNextSampleBuffer()  {
                            currentFrameCount += 1
                            if let progress {
                                let progressValue = min(Float(currentFrameCount)/Float(frameCount), 1.0)
                                Task { @MainActor in
                                    progress(progressValue)
                                }
                            }
                            if !videoWriterInput.append(sampleBuffer) {
                                videoReader.cancelReading()
                                continuation.resume(throwing: LivePhotosAssembleError.writingVideoFailed)
                                return
                            }
                        } else {
                            videoWriterInput.markAsFinished()
                            continuation.resume(returning: true)
                            return
                        }
                    }
                }
            }
        }
        
        async let writingAudioFinished: Bool = withCheckedThrowingContinuation { continuation in
            Task {
                audioReader.startReading()
                audioWriterInput.requestMediaDataWhenReady(on: DispatchQueue(label: "audioWriterInputQueue")) {
                    while audioWriterInput.isReadyForMoreMediaData {
                        if let sampleBuffer = audioReaderOutput.copyNextSampleBuffer() {
                            if !audioWriterInput.append(sampleBuffer) {
                                audioReader.cancelReading()
                                continuation.resume(throwing: LivePhotosAssembleError.writingAudioFailed)
                                return
                            }
                        } else {
                            audioWriterInput.markAsFinished()
                            continuation.resume(returning: true)
                            return
                        }
                    }
                }
            }
        }
        
        _ = try await (writingVideoFinished, writingAudioFinished)
        await assetWriter.finishWriting()
        return destinationURL
    }
    
    private func metadataItem(for identifier: String) -> AVMetadataItem {
        let item = AVMutableMetadataItem()
        item.keySpace = AVMetadataKeySpace.quickTimeMetadata // "mdta"
        item.dataType = "com.apple.metadata.datatype.UTF-8"
        item.key = AVMetadataKey.quickTimeMetadataKeyContentIdentifier as any NSCopying & NSObjectProtocol // "com.apple.quicktime.content.identifier"
        item.value = identifier as any NSCopying & NSObjectProtocol
        return item
    }
    
    private func stillImageTimeMetadataAdaptor() -> AVAssetWriterInputMetadataAdaptor {
        let quickTimeMetadataKeySpace = AVMetadataKeySpace.quickTimeMetadata.rawValue // "mdta"
        let stillImageTimeKey = "com.apple.quicktime.still-image-time"
        let spec: [NSString : Any] = [
            kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString : "\(quickTimeMetadataKeySpace)/\(stillImageTimeKey)",
            kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString : kCMMetadataBaseDataType_SInt8]
        var desc : CMFormatDescription? = nil
        CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
            allocator: kCFAllocatorDefault,
            metadataType: kCMMetadataFormatType_Boxed,
            metadataSpecifications: [spec] as CFArray,
            formatDescriptionOut: &desc)
        let input = AVAssetWriterInput(
            mediaType: .metadata,
            outputSettings: nil,
            sourceFormatHint: desc)
        return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
    }
    
    private func stillImageTimeMetadataItem() -> AVMetadataItem {
        let item = AVMutableMetadataItem()
        item.key = "com.apple.quicktime.still-image-time" as any NSCopying & NSObjectProtocol
        item.keySpace = AVMetadataKeySpace.quickTimeMetadata // "mdta"
        item.value = 0 as any NSCopying & NSObjectProtocol
        item.dataType = kCMMetadataBaseDataType_SInt8 as String // "com.apple.metadata.datatype.int8"
        return item
    }
}

extension LivePhotosUtils {
    
    private func cachesDirectory() -> URL {
        return TemplateFileStorage.fileDirectory(directoryPath: ["livePhotos"])
    }
    
    private func assembleCachesDirectory() throws -> URL {
        if let cachesDirectoryURL = try? FileManager.default.url(for: .cachesDirectory, in: .userDomainMask, appropriateFor: nil, create: false) {
            let cachesDirectory = cachesDirectoryURL.appendingPathComponent("livePhotos", isDirectory: true)
            if !FileManager.default.fileExists(atPath: cachesDirectory.path) {
                try? FileManager.default.createDirectory(at: cachesDirectory, withIntermediateDirectories: true, attributes: nil)
            }
            return cachesDirectory
        }
        throw LivePhotosAssembleError.noCachesDirectory
    }
    
    public func clearAssembleCachesDirectory() {
        do {
            let cachesDirectory = try assembleCachesDirectory()
            let fileManager = FileManager.default
            let fileURLs = try fileManager.contentsOfDirectory(at: cachesDirectory, includingPropertiesForKeys: nil, options: .skipsHiddenFiles)
            
            for fileURL in fileURLs {
                try fileManager.removeItem(at: fileURL)
            }
            
            print("Successfully cleared the assembleCachesDirectory")
        } catch {
            print("Error clearing assembleCachesDirectory: \(error)")
        }
    }
}

Pasting the source directly. The `TemplateFileStorage` in there is my own implementation, just for storing temporary files; the rest should be usable as-is.

@zkhCreator

I haven't profiled the performance in detail; since there was no perceptible stutter, I didn't worry about it.
By the way, thanks for the implementation suggestion. I haven't looked closely at Kingfisher's implementation; I just wrote things wherever the idea took me.

@onevcat
Owner

onevcat commented Nov 7, 2024

Thanks a lot. I'll look into the details; if it can be added, I may later add an option for this, so that everyone doesn't have to implement it themselves.

@zkhCreator

👌~

@HIIgor

HIIgor commented Nov 15, 2024

[screenshot: crash stack]
In poor network conditions, loading a Live Photo seems fairly likely to crash. Some crashes were reported in production, and I could reproduce it while debugging locally.
Environment: Xcode 16.0, iOS 18.1. After enabling network throttling in Charles, the crash in the screenshot above occurred.

@onevcat
Owner

onevcat commented Nov 15, 2024

@HIIgor Thanks for the report.

Although it may not correspond exactly to this crash stack, while investigating I found another situation that could cause a similar problem. Could you try the fix/duplicated-completion-call branch and see whether it still reproduces?

@HIIgor

HIIgor commented Nov 18, 2024

@HIIgor Thanks for the report.

Although it may not correspond exactly to this crash stack, while investigating I found another situation that could cause a similar problem. Could you try the fix/duplicated-completion-call branch and see whether it still reproduces?

After trying it, it no longer reproduces.

@HIIgor

HIIgor commented Nov 18, 2024

[screenshot: cache lookup code]
I ran into another problem: after successfully downloading a Live Photo with the code above, I can't retrieve the image object from the cache via its cacheKey. I'm not sure whether I'm using it incorrectly.

@onevcat
Owner

onevcat commented Nov 18, 2024

Live Photos are a bit special. First, they don't go through the memory cache at all, so you can't retrieve them that way. If you need in-memory access, you can keep a reference to the PHLivePhoto from the result.

Also, if you want to get the source image and video from disk, then because of PhotoKit, they must be stored on disk with a file extension, so you need to specify one explicitly.
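Keeping that reference might look like the following sketch, which continues the `livePhotoView`/`source` setup from the snippet earlier in this thread. Whether the success value exposes the loaded `PHLivePhoto`, and under what property name, depends on the Kingfisher version; `r.livePhoto` here is an assumption to verify against the API you ship with:

```swift
import PhotosUI

var retainedLivePhoto: PHLivePhoto?

livePhotoView.kf.setImage(with: source) { result in
    switch result {
    case .success(let r):
        // Keep our own in-memory reference, since Live Photos skip
        // Kingfisher's memory cache entirely.
        retainedLivePhoto = r.livePhoto  // property name is an assumption
    case .failure(let error):
        print("Live Photo error: \(error)")
    }
}
```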
